You can't judge a book by its cover, and you can't judge a benchmark by its raw score. On behalf of PC Magazine and eWeek, eTesting Labs recently conducted database- and application-server benchmark tests.

PC Magazine conducted a Nile-based benchmark test to compare the performance of SQL Server, Oracle, IBM DB2, Sybase, and MySQL. (Read the white paper for details about the Nile benchmark.) SQL Server finished dead last in the JavaServer Pages (JSP)-based performance test. I heard about this result from some colleagues who were upset that the test was based on a beta version of Microsoft's new Java Database Connectivity (JDBC) driver. They didn't think it was fair to judge database-server performance when it's limited by the artificial constraints of a middleware driver. And needless to say, any test that says SQL Server can't scale gets my immediate attention. I decided to investigate the matter. Below are my findings, as well as some thoughts about the value of benchmarking in general.

I think that measuring server performance through a middleware driver is foolhardy, so I was prepared for a bellyful of righteous indignation by the time I finished the PC Magazine review (which I encourage you to read). However, I read the entire article with an open mind and found that the reviewers did a reasonably good job of making it clear that Microsoft's JDBC driver played a significant role in SQL Server's poor performance. The reviewers underscored the JDBC driver's effect by running the same implementation of the Nile benchmark under ASP.NET to see how SQL Server performed on that platform. SQL Server running under ASP.NET was markedly faster than any of the JSP competitors and displayed much better throughput and scalability numbers. (Note: PC Magazine didn't report benchmark numbers for the competing databases running under a Microsoft .NET application architecture.) Those results appear to be good news for Microsoft, suggesting that ASP.NET is markedly faster than JSP and that SQL Server outpaces its competitors.

So what value does PC Magazine's SQL Server benchmark have for your decision-making purposes? Like any benchmark, the PC Magazine test simply measures the performance of an application under load in a precise set of circumstances. Mapping those numbers to your environment's performance is difficult or impossible. Extrapolating results is even harder when an end-to-end test like this one includes multiple layers of middleware and client software. Where does the bottleneck live: front end, middleware, or back end?

A Microsoft spokesperson says that PC Magazine didn't adhere to publicly available best practices for implementing a scalable .NET application. For example, PC Magazine used the generic OLE DB .NET Data Provider rather than the Microsoft-recommended SQL Server .NET Data Provider, even though all the JSP solutions were built with vendor-recommended native drivers. Microsoft says that using the native SQL Server .NET Data Provider would have had a substantial positive impact on a benchmark result that was already faster than any of the JSP solutions.
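To make that point concrete, here is a minimal C# sketch, assuming a hypothetical server name, database, and Orders table, of the same query written two ways: once against the generic OLE DB .NET Data Provider that PC Magazine used and once against the native SQL Server .NET Data Provider (SqlClient), which Microsoft recommends because it talks to SQL Server directly rather than going through an OLE DB layer.

using System;
using System.Data.OleDb;      // generic OLE DB .NET Data Provider
using System.Data.SqlClient;  // native SQL Server .NET Data Provider

class ProviderComparison
{
    // Hypothetical connection details, for illustration only.
    const string OleDbConnString =
        "Provider=SQLOLEDB;Data Source=myServer;Initial Catalog=Nile;Integrated Security=SSPI;";
    const string SqlConnString =
        "Data Source=myServer;Initial Catalog=Nile;Integrated Security=SSPI;";

    // The query goes through the OLE DB layer before it reaches SQL Server.
    static void QueryViaOleDb()
    {
        using (OleDbConnection conn = new OleDbConnection(OleDbConnString))
        using (OleDbCommand cmd = new OleDbCommand("SELECT COUNT(*) FROM Orders", conn))
        {
            conn.Open();
            Console.WriteLine("OLE DB provider: {0}", cmd.ExecuteScalar());
        }
    }

    // The same query through SqlClient, which speaks SQL Server's native
    // protocol directly, the path Microsoft recommends for SQL Server.
    static void QueryViaSqlClient()
    {
        using (SqlConnection conn = new SqlConnection(SqlConnString))
        using (SqlCommand cmd = new SqlCommand("SELECT COUNT(*) FROM Orders", conn))
        {
            conn.Open();
            Console.WriteLine("SqlClient provider: {0}", cmd.ExecuteScalar());
        }
    }

    static void Main()
    {
        QueryViaOleDb();
        QueryViaSqlClient();
    }
}

The application code is nearly identical either way; the difference lies in what sits between that code and the database, which is exactly the kind of implementation choice an end-to-end benchmark like this one ends up measuring.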

How fast would SQL Server 2000 have been if PC Magazine had implemented the Nile benchmark by using a full set of .NET best practices? Microsoft has such a benchmark, which you can read about on its GotDotNet Web site.

The Microsoft and PC Magazine tests are similar, but the code isn't 100 percent identical, so comparing the Microsoft Nile benchmark to the PC Magazine Nile benchmark isn't comparing apples to apples. Even so, the comparison demonstrates the inherent problems of analyzing benchmark results.

For example, Microsoft's best throughput numbers are more than three times higher than PC Magazine's, with both test suites running SQL Server on a four-CPU system. The Microsoft and PC Magazine Nile implementations both followed the design guidelines set forth in the Nile specification, and both ran the same type of queries in their workloads. So where does the performance difference come from?

Implementation decisions account for Microsoft's throughput being more than three times higher, and implementation decisions always affect performance in the real world. Let's relate this fact to the question of how relevant PC Magazine's SQL Server benchmark is. PC Magazine's test of SQL Server on .NET was faster than any of the competing JSP solutions the magazine tested. However, PC Magazine's best SQL Server results were roughly three times slower than the Nile numbers Microsoft achieved on similar hardware. In essence, PC Magazine tested the performance of the JDBC driver and an application that the magazine wrote; it didn't fully test SQL Server's performance characteristics because the benchmark depends so heavily on application-implementation decisions and middleware choices.

Incidentally, SQL Server recently posted a new world record on the SAP R/3 Sales & Distribution benchmark. SQL Server now holds the world record for database performance on 10 significant industry-standard benchmarks, including TPC-C. Check out Microsoft's Web site for the latest information about SQL Server benchmarks.