Beta-Testing Service Packs Is a Tough Job
I just read Brian Moran's interview with Microsoft Vice President of SQL Server Gordon Mangione ("Securing SQL Server," May 2003, InstantDoc ID 38537) and wanted to share another reason why SQL Server support staffs don't immediately apply service packs. Just because a service pack is available doesn't mean that it's completely tested and error free for your systems. We applied SQL Server 2000 Service Pack 3 (SP3) on a server last week, for example, and immediately found that applications that used to run in 2 minutes were taking more than 55 minutes to complete. We ran and reran sp_updatestats and didn't find any improvement. Then we ran UPDATE STATISTICS <table> WITH FULLSCAN (supposedly a solution if sp_updatestats doesn't help) and again found no improvement. After studying all other possible causes, including memory settings, I finally uninstalled SQL Server 2000 SP3 (an uninstall function sure would have been nice), then reinstalled SQL Server 2000 SP2. The problem immediately went away, and we had our 2-minute response time back. You might argue that we could have tuned our applications and added indexes to solve the performance problem under SP3. Probably true. However, when I upgrade my server to the new and improved service pack, I shouldn't have to retune the performance of software that's installed at more than 100 customer sites. You can see why a DBA might be reluctant to install new service packs when they're first available. Beta-testing service packs isn't a cost-efficient endeavor.
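For readers who want to try the same statistics-refresh steps before resorting to an uninstall, they boil down to something like the following (a sketch only; the table name Orders is hypothetical):

```sql
-- Refresh out-of-date statistics for every table in the current database.
EXEC sp_updatestats

-- If that doesn't help, rebuild statistics for a suspect table by
-- reading every row rather than sampling.
UPDATE STATISTICS Orders WITH FULLSCAN
```

In this reader's case neither step restored the pre-SP3 query plans, which is what made rolling back to SP2 the pragmatic fix.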
Authentication Mode Isn't Up to Customer
Microsoft's ongoing emphasis on not using mixed-mode authentication is frustrating. If that's the company's recommended best practice, as Microsoft Vice President of SQL Server Gordon Mangione said in "Securing SQL Server" (May 2003, InstantDoc ID 38537), then it needs to do a better job of motivating third-party software vendors to support Windows authentication. A lot of third-party vendors don't currently support Windows authentication, and when you ask them when they'll add such support, your question is met with silence. We have no discretion in choosing our authentication mode—our third-party vendors make that decision for us.
Using 32-Bit Tools to Migrate to 64-Bit
In Michael Otey's April 2003 article "The 64-Bit Question" (InstantDoc ID 37779), he says, "Alternatively, you can migrate by using the Copy Database Wizard, which besides moving the database, copies your database logins to the new master database." I believe that the 64-bit SQL Server Data Transformation Services (DTS) components aren't compatible with the 32-bit components, so the Copy Database Wizard won't work.
You can use the 32-bit Copy Database Wizard and DTS components to access and copy data from the 32-bit versions of SQL Server to the 64-bit version. You can also use all your current 32-bit database-access components and applications to connect to the 64-bit version of SQL Server. However, because the on-disk formats are the same, you'd probably want to simply detach, then reattach any databases that you want to migrate.
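Because the on-disk formats match, the detach/reattach route is just the standard stored procedures; a minimal sketch (database and file paths here are hypothetical) looks like this:

```sql
-- On the 32-bit instance: detach the database, releasing its files.
EXEC sp_detach_db 'Sales'

-- Copy Sales.mdf and Sales_log.ldf to the 64-bit server, then attach
-- them on the 64-bit instance:
EXEC sp_attach_db 'Sales',
    'D:\Data\Sales.mdf',
    'D:\Data\Sales_log.ldf'
```

Remember that detach/reattach moves only the database itself; logins and other server-level objects in master still have to be transferred separately.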
Solution 4: Add a Type Column
In Itzik Ben-Gan's March 2003 column, "Calculating Concurrent Sessions" (InstantDoc ID 37636), Solution 4's ORDER BY clause needs to include a type column so that a session that starts in the same minute as another session ends can be counted as concurrent with that session.
You're right. You should add the type column as the last sort column in the query for Solution 4 (Listing 8). Add 1 AS type as the last column of the first SELECT (the session-start rows), add the matching -1 constant to the second SELECT (SELECT app, endtime, -1), then include the new column in the sort:

ORDER BY app, ts, type <ASC | DESC>
Whether the type order should be ascending or descending depends on how you want to treat sessions that start and end at the same time. If you want to treat such sessions as concurrent, the type order should be descending, causing positives to come first. But you should definitely avoid my original implementation, without the type column in the ORDER BY clause, because it isn't deterministic.
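For readers without the original listing handy, the general shape of the corrected event stream is sketched below. This isn't the exact code from Listing 8; the Sessions table and its starttime/endtime columns are illustrative:

```sql
-- Union session starts (+1) and ends (-1) into one chronological
-- event stream; the type column breaks ties between a start and an
-- end that fall in the same minute.
SELECT app, starttime AS ts, 1 AS type   -- session-start events
FROM Sessions
UNION ALL
SELECT app, endtime, -1                  -- session-end events
FROM Sessions
ORDER BY app, ts, type DESC              -- DESC sorts starts before ends,
                                         -- so back-to-back sessions are
                                         -- counted as concurrent
```

Switching the final sort to type ASC makes ends sort first, so a session that starts exactly when another ends is not counted as overlapping it.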
Another Way to Track Uptime
I saw Brian Moran's SQL Server Savvy tip "Tracking Uptime" (April 2003, InstantDoc ID 38042) and hadn't thought about looking at SPID=1 to determine how long SQL Server has been running. As an alternative solution, here's the script I use on my servers to see how long the SQL Server service has been running:
SELECT crdate
FROM master.dbo.sysdatabases
WHERE name = 'tempdb'
By the way, great magazine!
Thanks for sharing your tip! Both solutions show that it's worthwhile to understand how SQL Server works internally when you're trying to solve a problem. In this case, Microsoft doesn't provide a direct system function to tell us how long SQL Server has been running. My solution works because SPID 1 is created when SQL Server starts up, and we can trust the login_time column to give us an accurate idea of when SQL Server started. Your solution works because SQL Server recreates tempdb each time the server is stopped and started. Both solutions require knowledge of how to query system tables to retrieve information that's not otherwise available.
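For comparison, the SPID 1 approach from the original tip amounts to a query along these lines on SQL Server 2000 (a sketch; both queries read undocumented system tables, so they could change in future releases):

```sql
-- SPID 1 is a system process created at startup, so its login_time
-- is effectively the instance's start time.
SELECT login_time
FROM master.dbo.sysprocesses
WHERE spid = 1
```

Either way, the answer comes from knowing what SQL Server does internally at startup rather than from any documented uptime function.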
Drilling into Dimension-Level Security
I've read Russ Whitney's columns about dimension-level security ("Security and Parameterization," December 2002, InstantDoc ID 27040, and "Customizing Dimension Security," January 2003, InstantDoc ID 27305) and have a question. We have a cube that consists of three parent-child dimensions and a Time dimension. One of these dimensions, User, contains a parental relationship, UserID to ParentUserID, where a null ParentUserID denotes the root node of the hierarchy. The cube's fact table contains information such as CostPerMonth of an application service (e.g., cost per month per user for Microsoft Word).
The cube design works well for reporting, showing level-decomposition based on the parent-child relationships. But how can we secure the cube so that users can access only the part of the hierarchy that pertains to them? For example, if John is at Level 3, his manager, Joe, is at Level 2, and Joe's manager, BigBoss, is at Level 1, John should be able to view only the facts that pertain to him and the employees he manages. Any guidance?
The scheme I outlined in the January column should work with one exception: It won't help you limit access to the higher levels of the dimensions. In your example, you want John to see his facts and his employees' facts but not his managers' facts. The only solution to this part of the problem that I'm familiar with is to create a calculated measure for each normal (loaded) measure in the cube that you want to secure. Use a formula in the calculated measure that determines whether the currently selected member (in the organization dimension) is at a level you want to secure and, if so, returns null or zero; otherwise, it should return the value from the loaded measure. You can use Analysis Manager to hide the loaded measures so that users can access only the calculated measures.
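One possible shape for such a calculated measure's MDX formula is sketched below. The measure and dimension names come from the reader's cube, but the level test (here, hiding everything above level 3) is hypothetical and will vary with your hierarchy:

```mdx
-- Formula for a calculated measure defined in Analysis Manager.
-- Returns NULL for members above the level you want to secure,
-- otherwise passes through the loaded measure.
IIF(
  [User].CurrentMember.Level.Ordinal < 3,  -- a level above the user's own
  NULL,                                    -- hide the value
  [Measures].[CostPerMonth]                -- otherwise show the real value
)
```

After defining one such calculated measure per loaded measure, hide the loaded measures themselves so that users can query only the secured versions.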
XP SP1 Bug Affects Poormon Solution
Regarding the monitoring solution in "Introducing Poormon" (February 2003, InstantDoc ID 37468), a bug in Windows XP Service Pack 1a (SP1a) causes the CounterDetails table data to be duplicated whenever a System Monitor log that's logging to SQL Server is stopped and started. The bug isn't present if you run the solution on Windows XP without SP1 applied. We apologize for any inconvenience and have informed Microsoft of this problem.