Two weeks ago, the SQL Server Magazine Web site ran an Instant Poll that asked, "Do you pay attention to TPC benchmark scores?" Eighty percent of respondents said, "No, we don't use TPC information." That result isn't surprising. Many people don't think the test environments that vendors create accurately reflect how customers use the products in their businesses. One reader expressed this opinion, which many of you probably share:

"It is nice to know that SQL Server is capable of those kinds of performance numbers. But could you break it down to actual usage statistics—a more realistic installation of a database server with, say, two or four processors running a business application that supports up to 500 users and performs a few hundred transactions per minute? How would Microsoft compare with Oracle or DB2 in that kind of test? High-end TPC numbers are interesting to read, but numbers that compare to the real world would be even more interesting."

As I mentioned in last week's commentary, it's unlikely that you'll ever see apples-to-apples comparisons of real-world benchmark numbers coming directly from the vendor community. No sane vendor would compare its product to that of a competitor using a typical business system. In such a comparison, there would be a "winner" and a "loser." Database vendors are pretty smart. They all have the ability to run private benchmarks to try to beat a given score. But vendors publish numbers only when their products are the winners. When a vendor's tests provide an unfavorable result, that vendor simply chooses not to publish the score. However, you can draw some interesting conclusions if you think about the benchmarks that aren't published.

For example, I visited the Transaction Processing Performance Council (TPC) Web site last Friday and sorted all the TPC-C scores by total system price in ascending order. In other words, I ranked the results by how much each system actually cost to build and focused on the least expensive ones. SQL Server 2000 held the 49 least expensive TPC-C scores, and those 49 scores were split fairly evenly across single-CPU, dual-CPU, and four-CPU solutions.

Microsoft also had 49 of the top 50 TPC-C scores measured by Price/tpmC, which is a price-performance ratio. (You can download all the TPC-C scores in .xls format at http://www.tpc.org/information/results_spreadsheet.asp .) As of March 13, the TPC site listed 104 active results. Microsoft was the only vendor to post a TPC-C result for any configuration with fewer than four processors; of the 104 active scores, Microsoft had published 30 in that sub-four-CPU category. The site listed a total of 30 scores for servers that used four CPUs: Oracle had three scores in that category, Sybase had two, and Microsoft had the remaining 25. Don't most of you run systems that range between one and four CPUs?
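If you download the spreadsheet, the two sortings described above are easy to reproduce yourself. Here's a minimal sketch; the rows below are made-up illustrations, not figures from the actual TPC file, and the real data would need to be exported from .xls to a readable format first:

```python
# Rank TPC-C results two ways: by Price/tpmC (total system price
# divided by tpmC throughput; lower is better) and by total system
# price ascending (the "least expensive systems" view).
# NOTE: these rows are hypothetical examples, not real TPC results.
results = [
    # (database, total system price in USD, tpmC throughput)
    ("SQL Server 2000", 38_000, 16_000),
    ("Oracle 9i", 480_000, 115_000),
    ("SQL Server 2000", 55_000, 21_000),
    ("Sybase ASE", 310_000, 68_000),
]

# Price/tpmC: dollars per transaction-per-minute of throughput.
by_price_perf = sorted(results, key=lambda r: r[1] / r[2])

# Total system price, ascending.
by_price = sorted(results, key=lambda r: r[1])

for db, price, tpmc in by_price_perf:
    print(f"{db}: ${price / tpmc:.2f}/tpmC")
```

Counting how many entries each vendor holds at the cheap end of either sorted list is then a one-line tally, which is all the analysis above amounts to.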

You might not find an apples-to-apples comparison on real-world platforms, but the TPC-C numbers—or lack of those numbers—in the one- to four-CPU space speak loud and clear. Perhaps other vendors simply aren't interested in publishing scores based on real-world server configurations. That's entirely possible. Or perhaps other vendors haven't published scores in that price range because such scores wouldn't put their products in a favorable light. I'll let you draw your own conclusion.

In last week's commentary, I also mentioned the DeWitt clauses that most major database vendors (except IBM) include in their End User License Agreements (EULAs). A DeWitt clause says you're not allowed to publish a benchmark number unless the vendor gives you permission. Most people strongly oppose rules that suppress the free flow of information, so I expected many comments about these clauses. But alas, just one reader shared his views. So I'll be more explicit this week: What's your opinion of the DeWitt clauses that prevent independent people from running their own benchmarks and publishing the results? This topic isn't as simple as it might seem, and I'll explain why in a future commentary. But first, I'd love to hear your opinion.