Friday, May 16, 2014

To Go or Not To Go: The SQL Server Conference Scene

As we approach the season of technical conferences, I find myself writing a proposal for my management outlining the benefits of attending them - specifically, the two conferences I believe to be the top annual SQL Server technical events: SQL PASS Summit (PASS) and SQL IT/DevConnections (DevConnections). My goal is to be in a position to attend one of these two conferences each year. To make that case, I decided to showcase their benefits: the variety of sessions/training offered, the caliber of presenters, access to vendors, non-session experiences, community/networking opportunities, and of course costs.

Sessions/Training:
SQL PASS Summit is designed specifically for SQL Server. It covers all things SQL Server, from administration to performance tuning, data warehousing, SharePoint integration, and beyond. The sessions are offered for experience levels ranging from novice to highly advanced/technical.

SQL IT/DevConnections is focused on development and programming across multiple technologies. The broad nature of this conference means the training sessions include Windows management, .NET development, and of course SQL Server. Again, the sessions offered range from novice to highly advanced; however, there is an expectation of basic programming and development knowledge.

Presenters:
Each of these conferences offers between 100 and 200 sessions, which allows for presenters with a wide variety of experiences and backgrounds. Past events have included Microsoft employees, technical authors, and the community’s top evangelists.

Vendors:
While the vendors at both events overlap considerably, it is important to note the benefits of having them there. The vendor halls provide an opportunity to see new products and features, get one-on-one time with the companies we rely on, and ensure that the best products are being leveraged in-house.

Non-Session Experience:
Beyond the basic presentation sessions, there is still value in attending these conferences. These non-session experiences have included a discounted book store (with many books written by the conference speakers), a computer lab for feature-specific lessons, discussion tables with subject matter experts, and even an on-site testing center for taking certification exams.

Community/Networking:
Both conferences offer the ability to network with peers in an environment you will not likely find anywhere else. With DBAs from around the world and community organizers and leaders all in one location, these relationships can become a foundation to bounce ideas off of and lean on for advice, as well as a path to getting directly involved on both a local and national scale.

Costs:
SQL Pass Summit 2014
     Location: Seattle, WA
     Conference Date: November 4-7, 2014
Conference Cost:
     Main conference (if paid by June 27): $1,595
     Optional: 1x Pre-conference workshop: $495
     Optional: Online copy of all conference presentations: $145
Travel (estimates):
     Airfare: $500
     Hotel (4 nights at $150/night): $600
     Meals (4 days at $25/day - breakfast and lunch provided): $100
Total Estimated Cost (with one pre-conference workshop): $3,435

SQL IT/DEV Connections 2014
     Location: Las Vegas, NV
     Conference Date: September 15-19, 2014
Conference Cost:
     Main conference: $1,695
     Optional: Pre/post workshop: $399
Travel (estimates):
     Airfare: $600
     Hotel (5 nights at $200/night): $1,000
     Food (4 days at $25/day - breakfast and lunch provided): $100
Total Estimated Cost (with 1 pre or post conference workshop): $3,794


Conclusion:
You certainly can't go wrong with attending either of these conferences. They are both going to offer opportunities beyond what you can get by reading a book or attending an online class. I hope that this shows the bigger picture of what these conferences have to offer and encourages everyone to participate in them.

Monday, March 31, 2014

Welcome to the wonderful world of being on-call: It's too early for this....

As part of a team of 7 DBAs, each team member does an on-call rotation for one full week starting Monday at 7:00am EST. While recapping my past week's on-call rotation with my team, it dawned on me that it’s been a while since I've blogged, and this could be a good insight into the world of on-call for those who haven't yet had the pleasure.

The week before I started my rotation, there were a lot of questions running through my mind. What does it mean to be on-call? What qualifies as an emergency vs. standard work requested off hours? How do I guarantee I can meet my SLA if I get stuck? As I expected, those questions were answered over the next 7 days.

Being on-call means that you are the first line of defense. When a problem arises, you get the page and make first contact. It’s then up to you to assess the situation and define what steps are going to be taken. During standard office hours, this usually means an impromptu conversation with the primary DBA, who then resolves the problem and does the root cause analysis. That is a privilege you likely won’t have during off hours. Since problems have a way of happening twice, I make a point to get together with that DBA once the problem is resolved to understand what actions were taken. Should it happen again off-hours, or when they are not available, I already have an idea of how to resolve the problem.

DBAs quickly discover there is never a lack of work requested. When on-call, it is important to identify which requests require immediate attention and which can wait. It’s also important to understand that you, the DBA, and the end-user requesting the work may not agree on the level of importance. My first instinct was to look at every email during off hours, thus allowing me to assess everything. After one night of sleeping in 30 and 45 minute intervals, I realized this was not effective. This begged the question: "What is coming to me via email and what is coming through as a page?" No, it’s not 1998 with pagers. For my company, the term page refers to a text message sent to the on-call DBA’s Blackberry. Pages are sent automatically by our monitoring tool or by engaging the emergency hotline. By reviewing and confirming that I’m getting actionable alerts on my pager, I was confident that I didn’t have to put eyes on every email.

I’m sure every DBA is briefed on the uptime SLAs for their environment within their first week of being hired. This is a timeline every DBA keeps in the back of their head, as it can be the lifeline our jobs live and die by. It can also be the single biggest point of stress for a DBA when something is broken. It’s important to note that for many environments the uptime SLA is different during critical office hours and non-critical hours; for this blog, I’m focusing on non-critical hours. This is also where having a defined escalation policy is of the utmost importance. No one is happy to get called at 3:30am because you’re stuck, but I can assure you that it is always preferred over taking no action and dealing with the same problem during critical use hours. While no one wants to point out their own shortcomings, I’ve never heard of a DBA getting terminated for waking up their manager, or even the manager’s manager, in the middle of the night. Unfortunately, I can’t say the same for a DBA who takes no action when action is warranted.

So what didn’t I think of? How will this impact my home life? After the second night of being on-call, it was suggested to me that if I didn’t want to be hit in the head with a pillow every time I was paged, then maybe I should find other accommodations. This was something I absolutely overlooked. But I considered how I would feel if I were the one with the pillow and no reason to be woken and decided that using the guest room during rotations was a very doable sacrifice.

Being on-call, whether for the first time or the 100th, can be stressful. Understanding the expectations and having an action plan makes it a bit less daunting.

Tuesday, March 12, 2013

You wouldn't ride a Vespa in the Isle of Man TT


For those that may not already know, I'm a bit of a gear head and love motorsports of all kinds. Of all the races around the world and from all the different categories, my favorite has to be The Isle of Man TT.

The Isle of Man TT (Tourist Trophy) is an incredible motorcycle race conducted in a time-trial format on the closed public roads of the Isle of Man (between Ireland and Britain). The 37.75-mile course consists of narrow, twisting streets and lanes flanked by stone walls and buildings, with average lap speeds exceeding 120 mph and top speeds breaking 200 mph (the record is 206 mph). For my money, this is by far the most exciting motorsports race there is.

So what would happen if a person entered on a Vespa? They have a tool with the means to let them ride and finish the course, but the problem is obvious: with the standard Vespa topping out at 45 mph, it’s clearly not the right tool for the task at hand.

I found myself thinking about this when I was tasked with finding a SQL Server performance monitoring solution - ensuring that I use the right tool for the task (and preventing me from riding a Vespa in the Isle of Man TT). The tools I decided to investigate were Quest Spotlight, Quest Foglight for SQL Server, Idera SQL Diagnostic Manager (DM), and SQL Sentry Performance Advisor (PA).


I quickly eliminated Quest Spotlight based on my Quest sales rep’s recommendation:

“Foglight for SQL Server is very similar to Spotlight, however it is a web-based tool, offers a cross platform single pane of glass view if you are looking to manage other servers outside of just SQL. It also offers customized alerting, customized reports, customized dashboards and alerting based on deviations from typical server performance. So, if the above items are important to you, Foglight for SQL Server is the way to go.”

Aside from Foglight’s enhanced features, the rep mentioned that Foglight is more comparable to both Idera DM and SQL Sentry PA, so I crossed off Quest Spotlight.

I read the Quest Foglight installation requirements and found that Foglight requires an agent be installed on each server being monitored. While this may not be a big deal for shops with a few servers, my organization has over 100 servers, and a 100-agent solution is not a logistical footprint I want to manage. Quest Foglight gets the same red ink as Quest Spotlight.

That left Idera DM and SQL Sentry PA. I used four areas of focus to compare the two: Cost, Footprint, Monitoring, and Alerting.

Cost
It is important to note that all prices came directly from each company’s website. Final prices may involve days of emails, hours of phone calls, and my potential bartering skills (or lack thereof).

At the Idera website, I am presented with a straight purchase option per license. While there is an option to talk with a sales rep to "ask about volume pricing," no volume prices were listed.

Idera SQL DM
# of Licenses    Price          Maintenance    Total
1                $2,049.00      $409.80        $2,458.80
5                $10,245.00     $2,049.00      $12,294.00
10               $20,490.00     $4,098.00      $24,588.00

From the SQL Sentry website, I found options for single licensing as well as Quick Start 5-packs. Again I was prompted to contact a sales rep for additional volume pricing.

SQL Sentry PA for SQL Server
# of Licenses    Price          Maintenance    Total
1                $1,495.00      $299.00        $1,794.00
5                $4,495.00      included       $4,495.00
10               $11,970.00     $1,495.00      $13,465.00

**Note that the cost for 10 licenses consists of (1) Quick Start 5-pack and (5) individual licenses with (5) maintenance fees.

At the 10-license level, SQL Sentry comes in at roughly 55% of Idera’s price ($13,465 vs. $24,588), giving SQL Sentry a substantial cost savings.

Footprint

Comparing Idera DM and SQL Sentry PA shows they both have the same basic elements: a repository database, a client interface, and a collection service. Neither requires an agent be installed on any monitored servers, and both services require local admin and sysadmin access to the monitored servers.
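As a quick aside, that last requirement is easy to verify before rolling either tool out. Here is a minimal T-SQL sketch to confirm the monitoring service's login has the sysadmin membership both products call for (the account name is just a placeholder):

    -- Sketch: confirm the monitoring tool's service account is a sysadmin.
    -- 'DOMAIN\svc_sqlmonitor' is a placeholder; substitute your actual service account.
    SELECT IS_SRVROLEMEMBER('sysadmin', 'DOMAIN\svc_sqlmonitor') AS is_sysadmin;
    -- 1 = member, 0 = not a member, NULL = the login does not exist on this instance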

But there are distinctive differences.

When reading about the features, I found that Idera DM requires three separate services for full functionality: 1) a collection service, 2) a management service, and 3) a predictive service. Idera also relies on Windows/SQL Server clustering as its means of fault tolerance.

SQL Sentry PA, on the other hand, offers full functionality with a single monitoring service. If desired, however, it also offers the ability to install multiple services, introducing automatic load balancing across the monitored servers. This can help reduce the impact of each service on its host server’s resources. In addition to load balancing, should any one service stop for any reason, the others will automatically "pick up" the servers it was responsible for and continue monitoring them.

Multiple services can also prove beneficial when working with domains without trusts, DMZ environments, or firewalls that limit access. By installing a separate service within the protected environment, the SQL Server port (1433 by default) is the only access needed.
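On that note, before requesting a firewall exception it is worth confirming which TCP port the instance is actually listening on, since it isn't always 1433. A quick sketch against the connections DMV:

    -- List the TCP port(s) in use by current connections to this instance.
    -- local_tcp_port is only populated for connections using the TCP transport.
    SELECT DISTINCT local_tcp_port
    FROM sys.dm_exec_connections
    WHERE net_transport = 'TCP';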

In spite of their similar structure, SQL Sentry’s ability to scale while maintaining integrated fault tolerance gives it the advantage once again.

Monitoring

Because both of these tools monitor much of the same performance and SQL Server information, I'm going to focus on what they do differently.

When monitoring a server with Idera DM, performance data is collected in two different ways: by the monitoring service and directly via the GUI. The monitoring service collects performance data at 6-minute intervals at all times. While this interval can be adjusted, Idera DM warns that doing so may negatively impact the server’s performance. When the client GUI is open, it connects directly to the monitored server and collects performance counter data at 10-second intervals. This may cause problems should the client not have permissions to the monitored server for any reason. It is also important to note that T-SQL statement collection is not enabled by default. Idera DM does, however, offer the ability to monitor VM host stats (ESX metrics) should you be working with VMs.

When working with SQL Sentry PA, the monitoring service is the only point of data collection, and it collects performance counters at 10-second intervals at all times. Also, while both products show completed queries and query-related stats, SQL Sentry PA collects actively running queries as well as the associated query plans, and this functionality is enabled by default. Finally, SQL Sentry PA monitors disk configuration and space allocation, and recently introduced monitoring for Processor Groups, NUMA configuration, and table and index size and fragmentation.
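To give a sense of what "actively running queries with plans" looks like under the covers, here is a rough DMV sketch of that kind of collection. To be clear, this is my own illustration, not SQL Sentry's actual collection query:

    -- Sketch: currently executing statements with their plans.
    SELECT
        r.session_id,
        r.status,
        r.wait_type,
        r.cpu_time,
        r.total_elapsed_time,
        SUBSTRING(t.text, (r.statement_start_offset / 2) + 1,
            ((CASE r.statement_end_offset
                  WHEN -1 THEN DATALENGTH(t.text)
                  ELSE r.statement_end_offset
              END - r.statement_start_offset) / 2) + 1) AS running_statement,
        p.query_plan
    FROM sys.dm_exec_requests AS r
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
    OUTER APPLY sys.dm_exec_query_plan(r.plan_handle) AS p
    WHERE r.session_id <> @@SPID;  -- exclude this query itself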

When looking at the major differences, there are a few key concepts that make SQL Sentry PA the winner. If I get a call from a user or developer that a server was slow earlier, I cannot rely on data collected at six-minute intervals to best represent the server’s activity. And if someone reports that the server is slow right now, I want to see what is actively running to have the complete picture. Though monitoring VM metrics is convenient, I’d rather have a monitoring tool specific to OS-level monitoring (like SQL Sentry PA for Windows).

Conditional alerting/responding

The alerting and responding options for both products are again similar. Both can send email, execute T-SQL, or launch jobs, and both have the ability to integrate with SNMP trapping.

So how are they different?

The biggest difference between these products is the ability to alert on performance counter thresholds. This is something Idera DM currently offers, while SQL Sentry requires a license for its Event Manager product for this functionality.

If I were able to watch the dashboard for all servers throughout the day, I would know when stats were beginning to get out of whack - rendering threshold alerts unnecessary. But unless you are among the lucky few with that luxury, we rely on alerts to warn us when a server needs attention before the impact to users grows large.
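For anyone without either product, the DMVs plus Database Mail can cover the basics. Here is a minimal home-grown sketch of a threshold alert - it assumes Database Mail is configured with a profile named 'DBA Alerts' (my placeholder) and that the script runs on a schedule from a SQL Agent job:

    -- Sketch: alert when Page Life Expectancy drops below a threshold.
    DECLARE @ple BIGINT;

    SELECT @ple = cntr_value
    FROM sys.dm_os_performance_counters
    WHERE counter_name = 'Page life expectancy'
      AND object_name LIKE '%Buffer Manager%';

    IF @ple < 300  -- threshold in seconds; tune for your environment
    BEGIN
        EXEC msdb.dbo.sp_send_dbmail
            @profile_name = 'DBA Alerts',               -- placeholder mail profile
            @recipients   = 'oncall-dba@example.com',   -- placeholder address
            @subject      = 'ALERT: Page Life Expectancy below threshold',
            @body         = 'PLE has dropped below 300 seconds; the buffer pool is under pressure.';
    END;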

While SQL Sentry is currently developing an alerting solution within the PA products, the advantage for alerting goes to Idera.

Conclusion

I am reminded that while there are many ways to accomplish a task, I’m sure we all come to the same conclusion that using the right tool can be the difference between finishing the race and winning the race. Just as a high-performance motorcycle will leave a Vespa in a trail of dust and win the Isle of Man TT, SQL Sentry’s Performance Advisor is the clear winner for monitoring SQL Server.


Monday, January 28, 2013

Change of Scenery


As I am going through a job change, I found myself thinking that this would be as good a time as any to begin blogging about my adventures (considering it might be something that others will relate to).

Let me start by rewinding to tell you a bit about myself. I started working at 14 and had a variety of jobs while trying to find my career path. My jobs ranged from grocery stores to movie rental stores; from ski shops to copy shops; from home loan processing to auto body repair estimates; all before ending up on a DBA team for a preventative health care company. This is where I was initially introduced to, and hooked on, databases and SQL Server. As you may have already guessed, I wasn't one of those people that knew what they wanted to do in high school, or college for that matter.

Once I got involved with SQL Server, I knew I had finally found something challenging and fun. Every day brought a slightly different puzzle to solve, and I loved it. It was then that I was able to turn all of this random job history and work experience into a successful Software Support Engineer position with SQL Sentry. (For those that don't know, SQL Sentry is the best SQL Server monitoring tool for DBAs to have - both in my biased opinion and in the unbiased opinion of most of the SQL Server community.)

Fast forward three years to the present where I begin a new chapter in my career as a production DBA for a large financial company. I am no longer looking through the window to help DBA customers. Instead, I am the DBA.

While there were a lot of thoughts and emotions that came with this transition, excitement was by far the most prevalent. This excitement is also the reason I have decided to start my blog: to share my experiences with the community, to show how reality differs from my expectations, and to show that change doesn’t have to be a bad thing.

I hope you enjoy the ride Into the mind of a DBA.