Editor's Note: Send your SQL Server questions to Richard Waymire, Microsoft's lead program manager for SQL Server management tools, at email@example.com.
How can I find out the maximum row size for a table in which all the varchars are fully populated?
The only way to find a SQL Server table's maximum row size is to query the system tables (or the INFORMATION_SCHEMA views) and add up the byte counts. Note, however, that Microsoft discourages querying system tables directly because their layout can change from release to release. Also note that a row can't exceed 8060 bytes in SQL Server 2000 or 7.0, so the total you calculate is capped at that limit.
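As a sketch of the system-table approach, the query below sums the maximum storage length of each column for a hypothetical table named MyTable. The syscolumns.length column holds each column's maximum physical size in SQL Server 7.0 and 2000; the sum ignores fixed per-row overhead such as the null bitmap and the variable-length column offsets, so treat the result as an approximation.

```sql
-- Approximate maximum row size for a user table (hypothetical name: MyTable).
-- syscolumns.length is the maximum physical storage size of each column.
SELECT SUM(c.length) AS approx_max_row_bytes
FROM syscolumns c
JOIN sysobjects o ON c.id = o.id
WHERE o.name = 'MyTable'
  AND o.type = 'U'
```

A similar sum is possible against INFORMATION_SCHEMA.COLUMNS, but you then have to translate each data type's declared length into bytes yourself.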
We want to upgrade our SQL Server 7.0 servers to Service Pack 2 (SP2). However, we have a Data Transformation Services (DTS) job that copies a table from one of our servers to a business partner's server that runs SQL Server 7.0 SP1. To run the DTS job properly, should the business partner upgrade its server to SP2?
Your partner can keep SP1 when you upgrade to SP2, but only if you password-protect all DTS packages that pass between servers with different service packs. Microsoft has corrected the incompatibility problem in SQL Server 7.0 SP3.
I want to schedule a Visual Basic (VB) 6.0 executable (.exe) program by using the SQLAgentCMDExec account to create a job through SQL Server 7.0 Enterprise Manager. The .exe program should use Universal Naming Convention (UNC) paths to create a file on a shared directory on the SQL Server 7.0 machine. The .exe file works fine when I run it from the server by double-clicking it or by opening a command prompt. However, it doesn't work when I run it from Enterprise Manager. The job's account owner has full access to the shared directory on which I want to create the file. How can I get the executable to work from Enterprise Manager?
The job runs in a security context that depends on the job owner's privileges. If the job owner is a SQL Server systems administrator (sa), the .exe file runs under the service account that SQL Server runs under. If the job owner isn't an sa, SQL Server runs the job in the security context of the SQLAgentCMDExec account. Therefore, the SQLAgentCMDExec account needs the rights to perform the same tasks that your executable performs, including access to the shared directory. The job can't run as the owner you specified because the SQLServerAgent service would have to log in as that user with a Windows NT password, which SQL Server doesn't support.
I heard that Microsoft is going to stop supporting SQL Server 7.0 soon, so we should upgrade our SQL Server 7.0 machines to SQL Server 2000. Can you confirm this rumor, and if it's true, when will the company end SQL Server 7.0 support?
Microsoft supports the current SQL Server release and the release that preceded it, including providing fixes. The company continues to support earlier releases but doesn't provide patches for any new problems that users find. So, SQL Server 7.0 support will continue until SQL Server's next major release comes out. And because Microsoft released SQL Server 2000 as recently as September 2000, the company will continue to support SQL Server 7.0 for quite some time.
I need to create two new columns in my Customer table—one that holds an order count and one that holds the last order date. However, the following UPDATE statement, which I tried to use to set these new columns, doesn't work:
UPDATE CUSTOMER
SET cust_oqty = cust_oqty + 1,
    cust_lastod = order_date
FROM ORDER JOIN CUSTOMER
    ON cust_id = order_cust_id
ORDER BY order_date
When I use a SELECT statement to test the join on a customer who has three orders, the query returns three rows. However, when I run the UPDATE statement, the Customer table's order count is 1, not 3, although the last order date is correctly set from the last row of the query. Another option is to use two subqueries in a SET clause, using COUNT(*) and MAX(order_date) for each column, as Listing 1 shows. Do I have other choices?
The UPDATE statement behaves this way because an UPDATE modifies each target row only once, even when the join matches that row several times, so SQL Server evaluates cust_oqty + 1 just once per customer. Populating the two new columns would be easier with a cursor than with the UPDATE statement. You can open a cursor on the orders, loop through each row, and update the matching customer's counter and last order date as you go. This process is slower than performing the operation in one statement but is much easier to program.
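The cursor approach might look like the following sketch, which borrows the column names from your UPDATE statement (all table and column names are assumptions from the question):

```sql
-- Sketch: walk the ORDER table in date order, updating each
-- customer's order count and last order date one row at a time.
DECLARE @cust_id int, @order_date datetime

DECLARE order_cur CURSOR FOR
    SELECT order_cust_id, order_date
    FROM [ORDER]
    ORDER BY order_date

OPEN order_cur
FETCH NEXT FROM order_cur INTO @cust_id, @order_date

WHILE @@FETCH_STATUS = 0
BEGIN
    UPDATE CUSTOMER
    SET cust_oqty = cust_oqty + 1,
        cust_lastod = @order_date
    WHERE cust_id = @cust_id

    FETCH NEXT FROM order_cur INTO @cust_id, @order_date
END

CLOSE order_cur
DEALLOCATE order_cur
```

Note the brackets around [ORDER], which are required because ORDER is a reserved word.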
The speed of our nightly tape backups on a 50GB SQL Server 7.0 database varies widely. For example, for 21 days, the backup speed was 5MBps. Next, for 85 days, the speed dropped to 2.5MBps. The speed rose again for 106 days, fell for 2 days, then peaked for another 34 days. The fluctuation is frustrating because every time the speed drops, the backup schedule goes out of control and affects other jobs. I don't see a pattern. Why are we getting the variation?
What environmental changes might be causing the fluctuation? Nothing in SQL Server explains this behavior, so look for factors that affect the read throughput of the disks or the write throughput of the tape drive. You can back up to the NUL device (BACKUP ... TO DISK = 'NUL') to measure maximum read throughput separately from the tape write speed. You can also run RESTORE VERIFYONLY to see whether the backups have been corrupted.
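Both checks are one-liners; the database and tape device names below are placeholders for your own:

```sql
-- Measure raw read throughput by discarding the backup stream,
-- which removes the tape drive from the equation.
BACKUP DATABASE MyDatabase TO DISK = 'NUL'

-- Verify that an existing backup set on tape is readable.
RESTORE VERIFYONLY FROM TAPE = '\\.\Tape0'
```

If the NUL-device backup is consistently fast, the bottleneck is on the tape-write side; if it fluctuates the same way, look at the disk subsystem.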
Upgrading my databases from SQL Server 6.5 to 2000 takes more than 13 hours when I use the Upgrade Wizard. However, I trimmed upgrade time to 5.5 hours by executing SQL Server 6.5 bulk copy program (bcp) out in native mode from SQL Server 6.5 to 2000. Before I began converting the SQL Server 6.5 database, I dropped all indexes except clustered indexes. I don't want to lose any data integrity or precision with datetime, numeric, or character data. Can upgrading my databases this way cause a data-conversion problem?
Upgrading your databases to SQL Server 2000 by executing SQL Server 6.5 bcp in native mode should work fine. Just make sure that bcp copies all the data, and also make sure that you move all the other relevant system state information that the Upgrade Wizard captures, such as logins, users, security settings, custom error messages, stored procedures, triggers, and constraints.
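For reference, the per-table transfer might look like the following command-line sketch; the server, database, and table names are hypothetical, and -n requests native mode so that datetime, numeric, and character data round-trip without text conversion (bcp prompts for the password when you omit -P):

```shell
REM Export from the SQL Server 6.5 server in native mode
bcp Sales..Customer out Customer.dat -n -S OldServer -U sa

REM Import into the SQL Server 2000 server
bcp Sales..Customer in Customer.dat -n -S NewServer -U sa
```

Recreate the nonclustered indexes you dropped, and script out logins, permissions, and other server objects separately, because bcp moves only table data.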
Do I need to follow special installation procedures before I upgrade my SQL Server 7.0 database in a Windows 2000 cluster to Service Pack 2 (SP2)? For example, do I have to uncluster before I upgrade, then recluster?
The SP2 readme.txt file gives you instructions. Yes, you have to uncluster, apply SP2, then recluster.
I wrote a T-SQL script to load 80 characters from a text file into what I thought was an 80-character, fixed-length comment column. Then, I found that the column was 220 characters long. How can I write a script to append characters 81 to 220 to the comment column?
The kind of script you need to write depends on the column definition. If it's a fixed-length character (char) column, SQL Server has already padded the value with trailing spaces out to 220 characters. If it's a varchar column, the value isn't padded. Listing 2 shows the script that appends extra characters to a fixed-length character column. Listing 3 shows the script for a varchar column.
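In the spirit of those listings, here is a minimal sketch of the two cases, assuming a hypothetical table MyTable with a comment_col column and the extra 140 characters already loaded into a variable:

```sql
-- @extra holds characters 81-220 from the text file (hypothetical)
DECLARE @extra char(140)
SELECT @extra = extra_text FROM StagingTable WHERE row_id = 1

-- char(220) column: keep only the first 80 characters so the
-- trailing pad spaces don't push the appended text past 220.
UPDATE MyTable
SET comment_col = SUBSTRING(comment_col, 1, 80) + @extra

-- varchar(220) column: the stored value is only 80 characters,
-- so you can append directly.
UPDATE MyTable
SET comment_col = comment_col + @extra
```

The StagingTable source here is an assumption; the key difference is the SUBSTRING (or equivalently RTRIM) needed to strip the pad spaces from the fixed-length column before appending.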
My complex SELECT statement includes a NOLOCK hint for each table. However, when I run sp_lock to review the locks taken, I see that SQL Server holds a table-level schema stability (Sch-S) lock on every table. How can I remove all locks?
You can't control schema stability locks because they're strictly internal to SQL Server. SQL Server needs these locks to prevent someone from changing a table's schema while you're executing a query against it; a NOLOCK hint removes only the shared data locks.