In last week's SQL Server Magazine UPDATE commentary, I lauded Microsoft's new TPC-C benchmark world record. But I received several letters that reminded me that many readers don't know what the Transaction Processing Performance Council (TPC) is or what its benchmarks mean.
For example, one reader asked, "Why does the TPC organization only test commercially licensed operating systems and databases? My presumptions would lead me to think that a non-profit organization would be benchmarking anything they could get their hands on. For example, why don't they test PostgreSQL or MySQL on a Linux platform?"
There's a simple answer to that question: fear of getting sued. I'll get to the more complex answer to the question in a minute. First, here's some background about the TPC to help you keep things in perspective.
The TPC's mission statement says, "The TPC is a non-profit corporation founded to define transaction processing and database benchmarks and to disseminate objective, verifiable TPC performance data to the industry." The tag line on the TPC Web site says "The Transaction Processing Performance Council defines transaction processing and database benchmarks and delivers trusted results to the industry." You get the picture: They perform database benchmark tests and publish the results. The TPC currently supports four benchmark suites: TPC-C, TPC-H, TPC-R, and TPC-W. TPC-C focuses on OLTP systems, TPC-H and TPC-R focus on decision support and data warehousing loads, and TPC-W is a transactional Web-commerce benchmark designed to test end-to-end system performance. The TPC Web site provides a wealth of information about the organization and the benchmarks and includes a surprisingly interesting history page (yes, I'm enough of a geek that I honestly found it interesting) at http://www.tpc.org/information/about/history.asp .
TPC benchmark scores usually have two components: tpmC and Price/tpmC. TpmC is the number of transactions per minute for the TPC-C test, and Price/tpmC measures how much each of those transactions costs, with the total system cost amortized over the life of the system. You'll find more detailed information about TPC pricing scores at http://www.tpc.org/information/pricing.asp .
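The arithmetic behind Price/tpmC is simple division, which this sketch illustrates. The figures here are hypothetical round numbers chosen for clarity, not values from any published TPC result:

```python
# Hypothetical figures for illustration only -- not from any published TPC result.
total_system_cost = 5_000_000.0  # amortized cost of hardware, software, and maintenance ($)
tpmC = 500_000.0                 # transactions per minute reported for the TPC-C run

# Price/tpmC: total amortized system cost divided by transactions per minute
price_per_tpmC = total_system_cost / tpmC

print(f"Price/tpmC = ${price_per_tpmC:.2f}")  # -> Price/tpmC = $10.00
```

A lower Price/tpmC is better: two systems with identical tpmC scores can differ dramatically in cost-effectiveness, which is why the TPC reports both numbers together.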
So let's return to the reader question above, which is essentially, "Why didn't I see a benchmark for XYZ product?" The TPC is an independent, non-profit organization. However, the TPC doesn't have the power to run benchmark tests on a database platform without the approval of the database vendor. In fact, every major database vendor except IBM includes in its license agreement a clause that forbids the publication of benchmark information without explicit permission. Here's the clause from the SQL Server End User License Agreement (EULA):
"You may not disclose the results of any benchmark test of either the Server Software or Client Software to any third party without Microsoft's prior written approval."
Oracle, Sybase, and Informix each have a similar clause. These clauses are generically known as "DeWitt clauses," after David DeWitt, who led the development of the Wisconsin benchmark in the early 1980s. The Wisconsin benchmark published less-than-favorable scores for an Oracle database, and Oracle wasn't happy with the negative publicity. Oracle responded by adding a clause to its license agreement forbidding unauthorized benchmarking, and most other vendors followed suit. So the answer to the first part of the reader's question is that many benchmarks are never performed because the database vendor might not allow the results to be published. You might still see unauthorized database benchmarks that other independent organizations have published; vendors are hesitant to enforce the clause in court because they know the publicity would be terrible. But technically, publishing an unauthorized benchmark could expose the publisher to a lawsuit from the vendor. Regardless, you won't see an unauthorized benchmark from the TPC.
Vendors use their own resources to run TPC benchmark tests, and they hire independent third parties to audit the numbers and ensure they've followed TPC rules. Only then will you see a TPC-C score on the TPC site. Anyone can submit a TPC benchmark score, as long as the vendor authorizes it, and submitting a benchmark costs only $1,250. You can read about how to submit a score at http://www.tpc.org/information/other/submit_results.asp . Of course, it could easily cost you millions of dollars to build the test environment, run the test, and have the test audited. Because of this practical constraint, I'm not aware of any TPC number that has ever been released without a database vendor and a hardware vendor teaming up to publish the result.
The answer to the second part of the reader's question becomes obvious when you consider that vendors decide which benchmarks they'll publish. Naturally, vendors publish only the numbers that further their strategic interests. I won't speculate about what those strategic interests might be, but you probably won't see a TPC-C score published unless the vendor thinks it's in the company's best interest. No sane vendor would ever publish a head-to-head, apples-to-apples comparison with an existing number unless the results make the company look good. Unfortunately, those head-to-head, apples-to-apples comparisons are exactly what most real users want.
Is the new world record that Microsoft set accurate and true? Yes. Oracle would publish a new benchmark score in a heartbeat if it had a number that could beat it. TPC numbers are independently audited and the scores can be challenged, so you can trust them. Microsoft is on top for the time being. TPC benchmark scores do serve a valid purpose, and you can glean a lot of useful information from them. However, interpreting a TPC score requires an understanding of how and why that score was published. Companies publish scores for marketing reasons, not to support a noble goal of providing the most comprehensive set of benchmark information available to end users.