Performance is always a concern, but until it becomes a problem, it's rarely a priority. That's why so many database applications run into performance problems: most applications are built around a "get it done yesterday" schedule, and performance concerns are secondary to release. In fact, in most cases, all a developer focuses on is how an application works for a single user on a high-end developer machine. "It works on my machine" is the response to most bugs that emerge just prior to, during, or shortly after deployment.

However, performance is anything but a new issue, and over the years more and more resources and tools have been created to help quickly identify performance bottlenecks. Visual Studio (VS) 2010 includes the latest version of one such tool, the VS Performance Profiler. This feature-rich tool includes several new capabilities, including some that are specific to checking the performance of your ADO.NET data access. This article will provide an introduction to this tool and then walk you through using it to debug a couple of specific performance scenarios.

To follow along with the examples discussed in this article, you'll need a version of VS 2010 above the Express level (Professional, Premium, or Ultimate); the analysis tools are available in all of these editions. To use the analysis tools, you'll need an application to test. A simple application solution is associated with this article, with VB and C# versions available for download. The sidebar "Create a Sample Application for the Performance Profiler to Test" describes loading or creating the solution. This article glosses over creating the sample application and mainly looks at testing the code's behavior.

These samples were developed against a local SQL Server 2008 R2 database and also tested with SQL Server 2008; your results may vary with other database versions, but the profiling tool itself doesn't depend on a particular database version. The provided examples use the standard AdventureWorks sample database, available for download from CodePlex at msftdbprodsamples.codeplex.com. Download the example code from the "Download the Code" link at the top of the page.
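If you're building the sample yourself rather than downloading it, a typical ADO.NET call against AdventureWorks looks something like the sketch below. The connection string, table, and column names here are my assumptions, not taken from the downloadable sample; adjust the Data Source, Initial Catalog, and table name (the schema differs between AdventureWorks versions, e.g., Person.Contact versus Person.Person) to match your environment.

```csharp
using System;
using System.Data.SqlClient;

class AdventureWorksDemo
{
    static void Main()
    {
        // Assumed connection string -- change Data Source and Initial Catalog
        // to match your local SQL Server instance and database name.
        var connectionString =
            @"Data Source=(local);Initial Catalog=AdventureWorks;Integrated Security=True";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT TOP 10 FirstName, LastName FROM Person.Contact", connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0} {1}",
                        reader.GetString(0), reader.GetString(1));
                }
            }
        }
    }
}
```

Calls like ExecuteReader are exactly what the tier-interaction profiling discussed later in this article measures.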

The Profiler

The profiler has enough features and capabilities that I could write a book chapter and still not cover them all, so I'll start by looking at some of the key features and defining the areas that won't be explored in this article. At its core, this article aims to introduce database developers to some of the tools available for performance monitoring, so anything dealing with high-end High Performance Cluster testing and elements related to parallelism is outside its scope.

Your first step is to start VS 2010 with administrator privileges. (Note that it's possible to run performance tests from the command line, but I'll be working from within VS.) Administrator privileges are required because the performance tools access protected system-level information. The sample application used for this article is a Windows Forms application, but you can create your own. The Performance Profiler can be used with both ASP.NET and WPF applications (but not Silverlight applications) and can even be configured to profile JavaScript functions, but I'll focus on ADO.NET.

Having opened or created a new Windows Forms application, you now have access to the Analyze menu within VS. The first item is Launch Performance Wizard…, which opens the screen that Figure 1 shows. This initial screen is a useful starting reference because it illustrates the top-level options for performance profiling. Before moving on, I'll describe the focus of each of these options.

If you're familiar with the VS 2008 version of the Performance Profiler, you're probably aware that it didn't support virtual environments; you had to be running on the native OS to use sampling. One of the enhancements in VS 2010 is that you can conduct CPU sampling tests on applications running in a virtual environment. The key to sampling, however, is that your application runs unchanged: the profiler simply monitors the CPU to track performance while your application runs. This is the recommended starting point for performance profiling because it most accurately reflects your application's overall impact on the system.

The second option, Instrumentation, works by taking your compiled code and adding special calls to the profiling engine to track the time when each method is entered and exited. While it may not be the best place to start for your profiling, it's an excellent tool for narrowing down a specific method that might be a performance bottleneck.
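Conceptually, instrumentation wraps every method in enter and exit probes so the profiler can attribute elapsed time to specific methods. The sketch below imitates that idea by hand with a Stopwatch; the method names are invented for illustration, and the real profiler injects equivalent probes into your compiled binary automatically.

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class InstrumentationSketch
{
    // Hand-rolled equivalent of the enter/exit probes instrumentation injects.
    public static TimeSpan TimeMethod(Action method)
    {
        var stopwatch = Stopwatch.StartNew();   // "method entered" probe
        method();
        stopwatch.Stop();                       // "method exited" probe
        return stopwatch.Elapsed;
    }

    static void Main()
    {
        // Simulate a slow method; the probes report its elapsed time.
        TimeSpan elapsed = TimeMethod(() => Thread.Sleep(50));
        Console.WriteLine("Elapsed ms: {0}", elapsed.TotalMilliseconds);
    }
}
```

Because the profiler adds this bookkeeping to every instrumented method, instrumentation changes your binary and adds overhead, which is why sampling is the suggested starting point.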

The .NET Memory Allocation option allows you to track an application's memory use over time. It also uses a sampling engine, so it doesn't instrument your code to track exactly when memory is allocated. The idea is to let a developer spot an application that's consuming ever-increasing amounts of memory while running. Although memory leaks are now limited by the .NET garbage collector, an application can maintain unrecognized top-level references to objects and thus behave in a manner that looks very similar to an application leaking memory.
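The kind of "managed leak" this profile helps you spot usually comes from a long-lived collection that keeps objects reachable. In this sketch (the class and member names are mine, not from the sample app), the garbage collector can never reclaim the buffers because a static list still references them:

```csharp
using System;
using System.Collections.Generic;

class LeakySketch
{
    // A long-lived ("top-level") reference: everything added here stays
    // reachable, so the garbage collector can never reclaim it.
    public static readonly List<byte[]> Cache = new List<byte[]>();

    public static void HandleRequest()
    {
        // Intended as a temporary buffer, but the static list pins it in memory.
        Cache.Add(new byte[64 * 1024]);
    }

    static void Main()
    {
        for (int i = 0; i < 100; i++)
        {
            HandleRequest();
        }
        // Memory grows with every call -- the .NET Memory Allocation profile
        // would show ever-increasing live bytes attributed to HandleRequest.
        Console.WriteLine("Retained buffers: {0}", Cache.Count);  // prints 100
    }
}
```

In the profiler's report, this pattern shows up as live object counts and bytes that climb steadily for the lifetime of the process instead of leveling off after garbage collection.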

The final option at this top level is Concurrency, which is focused on thread management. As I noted, I'm not going to focus on concurrency issues, but this tool targets them. Most race-condition errors occur when an application is under stress and can be difficult to recognize; by combining the performance profiler with a stress test, you can get a picture of any thread-related issues.
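A minimal example of the kind of race condition the Concurrency profile targets: two threads incrementing a shared counter without synchronization will usually lose updates under load, while an atomic Interlocked increment will not. The names here are illustrative, not from the sample application.

```csharp
using System;
using System.Threading;

class RaceSketch
{
    static int unsafeCounter;
    public static int safeCounter;

    public static void Main()
    {
        ThreadStart work = () =>
        {
            for (int i = 0; i < 1000000; i++)
            {
                unsafeCounter++;                        // read-modify-write race
                Interlocked.Increment(ref safeCounter); // atomic, no race
            }
        };

        var t1 = new Thread(work);
        var t2 = new Thread(work);
        t1.Start(); t2.Start();
        t1.Join(); t2.Join();

        // safeCounter is always 2,000,000; unsafeCounter is usually less
        // because the two threads overwrite each other's increments.
        Console.WriteLine("Unsafe: {0}, Safe: {1}", unsafeCounter, safeCounter);
    }
}
```

Note that the unsafe counter often comes out correct on a lightly loaded machine, which is exactly why these bugs surface only under stress.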

At the bottom of the screen that Figure 1 shows is an important link. Following this link takes you to MSDN, where you can read more about the profiling methods the performance profiler supports. The web page also introduces one of the key focuses of this article: ADO.NET tier interaction.

Figure 1: The Launch Performance Wizard's first page

Profiling Data Tier Interaction

As noted on the page the wizard links to (the MSDN article "Understanding Profiling Methods" at msdn.microsoft.com/library/dd264994.aspx), there's another profiling method called tier interaction. This type of profiling runs in parallel with either sampling or instrumentation and lets you collect performance information on your SQL Server database calls. When you enable it for a test, the profiler captures how much time your application spent in the database and how often it went there. It also reports the minimum, maximum, and average time spent on your queries, and a detail view gives you these same basic measurements on a per-query basis.
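To make those numbers concrete, the tier-interaction report is essentially this aggregation performed per query. The sketch below computes count, minimum, maximum, and average from a list of hypothetical query durations; the millisecond values are invented for illustration.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class TierInteractionSketch
{
    static void Main()
    {
        // Hypothetical elapsed times (ms) for repeated executions of one query.
        var durationsMs = new List<double> { 12.0, 45.0, 9.0, 30.0 };

        // The tier-interaction detail view reports these same aggregates
        // for each distinct query your application issued.
        Console.WriteLine("Count: {0}", durationsMs.Count);      // 4
        Console.WriteLine("Min:   {0} ms", durationsMs.Min());   // 9
        Console.WriteLine("Max:   {0} ms", durationsMs.Max());   // 45
        Console.WriteLine("Avg:   {0} ms", durationsMs.Average()); // 24
    }
}
```

A large gap between minimum and maximum for the same query is often the first hint worth investigating, since it suggests the query's cost depends heavily on parameters or server load.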

To get to where you can access this type of profiling, you need to complete the wizard; it doesn't matter which of the four main profile methods you select. The next screen allows you to select from the available projects, with your currently loaded project selected by default. Select Next on this screen, then Finish on the following screen to complete the wizard. Once you do, VS opens the Performance Explorer window, which Figure 2 shows; it lets you manage the performance session. Figure 2 shows a session where the profiler has already been run once. The perfTest session shown in Figure 2 is a persistent item that can be run repeatedly, keeping track of each set of results along with the current settings for your performance profiling.

Figure 2: Performance Explorer