Many people overlook SQL Server Profiler in SQL Server 2000 and 7.0 as a Web development tool. Because one of Profiler's functions is to monitor and report the various performance levels of your SQL Server database, you can use it to look at how your Web applications interact with the server. Furthermore, Profiler can help you analyze your code's efficiency. For example, you can use Profiler to determine whether your Web application would perform better if you used Query Analyzer to optimize the code.
To generate performance numbers for Profiler to use in analyzing your Web application's communications with SQL Server, start by testing a few pages. Choose a testing tool that can place a real-world load on your application, then run Profiler to watch this load. You can download the latest Web Application Stress Tool, for example, from http://webtool.rte.microsoft.com. Application Center, a new Microsoft product now in beta 2 testing, also includes an updated version of this tool.
When you run traces with Profiler, launch a new trace for each run and save the results under easily identifiable names. After you have conducted several tests, you can then return and compare the results at each step. You can also replay a saved trace later to simulate the real load on the database. Keep in mind that a load you replay with Profiler or Query Analyzer only simulates a load, which is different from applying the load through your Web application. Here are four simple tests I conducted to illustrate the use of Profiler.
Test 1: Running a simple query in ASP script. First, I wrote the simple ASP script that Listing 1 shows: Database.asp, which includes the RunWithRS function. Then, I wrote the script in Web Listing 1 (see the More on the Web box, page 74, for download information), FirstData.asp, which calls the RunWithRS function and runs the following query:
SELECT ckey1, col2 FROM testtable WHERE ckey1 = 'a'
I ran the FirstData.asp script on two Web servers. Server 1 was a Windows 2000 workstation with a 600MHz Pentium CPU and 192MB of RAM, and server 2 was a Toshiba Tecra (133MHz Pentium, 144MB of RAM) running Win2K as well as SQL Server 7.0.
I started Profiler, and after setting the trace properties I began a trace. In the trace results (which you can see in Web Figure 1), the SQL:BatchStarting line marked the start of the SQL batch that ran from server 1, and the SQL:BatchCompleted line marked the batch's completion. In test 1, the durations for the two servers were identical. That result is what you would expect, because accessing the database from either system should report the same response times; the only difference between the tests was the Web server, which should not affect the database execution.
Test 2: Running a stored procedure. In the second test, I created a stored procedure from the SQL statement:
SELECT ckey1,col2 FROM testtable WHERE ckey1 = @ckey
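The article doesn't reproduce the procedure's definition or name, so the following is a plausible sketch; GetTestData is an assumed placeholder name, and the parameter type is inferred from the single-character key value used in test 1.

```sql
-- Hypothetical definition; the procedure name (GetTestData) and the
-- @ckey data type are assumptions, not taken from the article.
CREATE PROCEDURE GetTestData
    @ckey char(1)
AS
SELECT ckey1, col2
FROM testtable
WHERE ckey1 = @ckey
```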
Then I ran the stored procedure by changing the sSQL parameter to call the procedure instead of the dynamic SELECT statement.
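For illustration, the changed assignment might look like the line below; the procedure name is the same hypothetical placeholder, since the article doesn't give the actual statement.

```vbscript
' Hypothetical sSQL assignment for test 2; GetTestData is an
' assumed procedure name, not one given in the article.
sSQL = "EXEC GetTestData 'a'"
```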
Test 2 resulted in a duration of 1443ms, a finding similar to the 1494ms recorded for the dynamic SELECT statement in test 1.
Test 3: Using the ADO Command object in FirstData2.asp. For test 3, I used the ADO Command object and created a parameters collection for the stored procedure. Listing 2 shows the function containing the ADO code. I started a new trace and saved the results to a file called trace1.trc. Running FirstData2.asp, the .asp script that has the ADO code with the parameters collection, produced durations from 1370ms to 1600ms. Then I stopped the trace.
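Because Listing 2 isn't reproduced here, the following is a minimal sketch of the ADO Command approach. The procedure name (GetTestData), connection string, and parameter details are assumptions; building the parameters collection explicitly spares ADO a round trip to the server to discover the parameters.

```vbscript
' Sketch of an ADO Command call with an explicit parameters collection.
' The connection string and GetTestData procedure name are assumed.
Dim cn, cmd, rs
Set cn = Server.CreateObject("ADODB.Connection")
cn.Open "Provider=SQLOLEDB;Data Source=(local);" & _
        "Initial Catalog=YourDatabase;Integrated Security=SSPI;"

Set cmd = Server.CreateObject("ADODB.Command")
Set cmd.ActiveConnection = cn
cmd.CommandText = "GetTestData"
cmd.CommandType = adCmdStoredProc   ' 4, if adovbs.inc is not included

' Append the parameter explicitly: name, type (adChar = 129),
' direction (adParamInput = 1), size, and value.
cmd.Parameters.Append cmd.CreateParameter("@ckey", adChar, adParamInput, 1, "a")

Set rs = cmd.Execute
```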
Test 4: Using the Index Tuning Wizard. In the final test, I started Profiler's Index Tuning Wizard from the Tools menu. To use the Index Tuning Wizard, on the first screen click Next. On the second screen, select the database you're testing, then click Next. On the third screen, leave I have a saved workload file selected, then click Next. (You can save a trace or the SQL as a workload file.) On the next screen, select your workload file by clicking the My workload file button, browse to your file, then click Next. And on the following screen, select only the table you are testing (miscdata), then click Next.
The Index Tuning Wizard then ran the load that I captured from my trace against the database and analyzed the load. When the wizard completed its analysis, it suggested the index that Figure 1 shows. That screen also displayed the estimated performance gains from the index. One analysis showed an estimated 80 percent performance improvement. On the performance data screen, I clicked Next. On the next screen, I clicked Apply changes, then clicked Next. On the final screen, I clicked Finish to apply the index recommendations.
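The actual recommendation appears in Figure 1, which isn't reproduced here; given that both the dynamic SQL and the stored procedure filter on ckey1, a plausible form of the suggested index, with an assumed index name, would be:

```sql
-- Assumed form of the wizard's recommendation; the actual index
-- it suggested is shown in Figure 1.
CREATE NONCLUSTERED INDEX idx_testtable_ckey1
ON testtable (ckey1)
```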
When I ran the tests again with the new index, processing performance improved for each page. However, the improvement was about 25 percent rather than the estimated 80 percent. The results were the same for both the stored procedure and the dynamic SQL from test 1. The CPU column was even more interesting: before the index change, CPU times ranged from 410ms to 441ms; after the change, they dropped to 270ms for the dynamic SQL and to a range of 210ms to 240ms for the stored procedure. You can see that placing one index on the table dramatically improved performance and reduced the CPU load on the server.
A Powerful Investigative Tool
I've addressed a fairly narrow area of analysis in this article. You can generate initial performance numbers by testing with a few pages, as I did. When you're ready, you can use Profiler to dig deeper into your Web application and watch the activity under a magnifier. For example, you can capture various other important statistics during each step that SQL Server takes to process your SQL statement or stored procedure. And remember that applying recommended indexes to your database can improve performance significantly. For more information about Profiler, see Itzik Ben-Gan, "Trace That Event with SQL Server Profiler," April 2000; Paul Burke, "Using SQL Profiler," July 1999; and Brian Moran, "What? You Still Aren't Using SQL Profiler?"