Editor's Note: SQL Server Magazine welcomes feedback about the magazine. Send your comments to firstname.lastname@example.org. Please include your full name, email address, and daytime phone number with your letter. We edit all letters and replies for style, length, and clarity.
Looking at Key Size
Michelle A. Poolet's Solutions by Design: "Surrogate Key vs. Natural Key" (February 2002, InstantDoc ID 23449) was a great read for both DBAs and data modelers. However, the article seemed to focus on the physical size of primary keys rather than the core purpose of primary keys: to uniquely identify records. From a data-modeling perspective, the last thing I think about is a key's size and the possible impact of that size on the underlying physical database implementation. I've always listed the possible primary keys, then looked at usefulness (a simple identity column versus other primary key combinations) and the complexity of intertable joins before considering column and key size. However, I agree that key size is important at a pure performance (or storage) level in the physical implementation.
You make some excellent points. I've written many columns about the importance of primary keys and how to choose them wisely. I figured my readers were getting tired of hearing the same old thing, so in February, I took a different tack and looked at the implications of key size. Primary key size is especially important now because of the rearchitected index structure in SQL Server 2000 and 7.0: in those releases, nonclustered indexes use the clustering key as the row locator, so a wide clustered primary key is duplicated in every nonclustered index on the table.
I appreciate your comment about the primary concern of data modelers. In an environment where data modelers are distinct from DBAs, the data modeler's primary job is to accurately and fully represent the business's data requirements. The data modeler then passes the database design to the DBA for physical implementation, which is when considerations such as the physical size of the primary key come into play. The DBA who does the physical implementation might even be the best person to decide which columns to cluster the table on because that person has (or should have) intimate knowledge of which queries run most often and which tables are hardest hit. However, most small to midsized enterprises don't have separate data modelers and DBAs, so this column was for the people who have to wear many hats to keep a database environment running strong.
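To make the size tradeoff concrete, here is a minimal T-SQL sketch contrasting the two approaches the letter describes. The table and column names are hypothetical, invented for illustration; neither appears in the original article.

```sql
-- Option 1: composite natural key as the clustered primary key.
-- The key is 16 bytes wide (int + char(10) + smallint), and under
-- SQL Server 7.0/2000 that entire key is carried as the row locator
-- in every nonclustered index on the table.
CREATE TABLE OrderLineNatural (
  OrderID  int      NOT NULL,
  SKU      char(10) NOT NULL,
  LineNum  smallint NOT NULL,
  Qty      int      NOT NULL,
  CONSTRAINT PK_OrderLineNatural
    PRIMARY KEY CLUSTERED (OrderID, SKU, LineNum)
)

-- Option 2: a 4-byte surrogate identity column as the clustered
-- primary key. The natural key is still enforced for uniqueness,
-- but through a UNIQUE constraint rather than the primary key, so
-- nonclustered indexes carry only the narrow surrogate.
CREATE TABLE OrderLineSurrogate (
  OrderLineID int IDENTITY(1,1) NOT NULL,
  OrderID  int      NOT NULL,
  SKU      char(10) NOT NULL,
  LineNum  smallint NOT NULL,
  Qty      int      NOT NULL,
  CONSTRAINT PK_OrderLineSurrogate PRIMARY KEY CLUSTERED (OrderLineID),
  CONSTRAINT UQ_OrderLineNaturalKey UNIQUE (OrderID, SKU, LineNum)
)
```

Either design uniquely identifies each row; the difference the February column explored is purely in the physical cost of carrying the wider key through the index structures.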
Beyond Rote Documentation
Amen to Brian Moran's SQL Server Magazine UPDATE commentary "Microsoft Helps Solve Information-Overload Problem" (December 6, 2001, InstantDoc ID 23430). As a product manufacturer, Microsoft is responsible for documenting its products' components, commands, and so on so that customers can use the products successfully. However, Microsoft does itself and its user communities a great service when it goes beyond rote documentation and adds a lens of experience through which to view and use its products. We don't use SQL Server and other Microsoft technologies in a vacuum. Thumbs up to any initiatives that help Microsoft pass on product expertise, discuss product use in the real world, and enhance SQL Server customers' knowledge and skill sets. Anything that makes customers more effective with the product will translate into a bigger market share for Microsoft. This strategy is a winner for everyone but the competition.