It’s that time again: Since 2003, SQL Server Magazine has invited SQL Server professionals to enter their most creative solutions to technical problems in the annual SQL Server Magazine Innovators contest. Our winning entries this year—one grand prize, three runners-up, and two honorable mentions—show yet again that SQL Server pros are a resourceful, technically adept bunch and emphasize the prominent position that development has in the SQL Server pro’s skill set. The solutions, though diverse, highlight SQL Server’s value as a tool for providing essential business information.
When Ermedin “Dino” Selmanovic and his eight-member development team at Moore Stephens Consulting, a UK-based firm specializing in developing custom applications for the insurance industry, proposed a business intelligence (BI) solution for a client two years ago, their real challenge wasn’t choosing the solution’s components; the challenge was making the components work together. Microsoft provides a strong set of BI tools for SQL Server, and, says Dino, “We’re 100 percent Microsoft, in terms of the solutions we provide—which tend to be SQL Server, Microsoft Analysis Services, SQL Server Reporting Services, and the ProClarity suite of products [which Microsoft recently acquired].” Dino and the Moore Stephens team proposed a solution that combined all these products to provide business and analysis reporting functions for the client. “On top of that, we needed to somehow integrate Analysis Services cubes, Reporting Services, and the ProClarity products—ProClarity Desktop Professional, ProClarity Analytics Server, and ProClarity Dashboard Server—into one portal that could be accessed internally as well as by the third parties that sell the products that the client provides,” says Dino.
The integration aspect proved to be a tricky part of implementing the solution. The development team needed to write code to integrate the separate pieces of software so that each time the developers updated the solution—for example, to produce new reports or replace certain old views with new ones in response to the client’s request—all the software components could communicate with each other seamlessly.
Even more problematic was the installation itself. When Dino and his team finally had the first version of the solution ready for the client to evaluate, Dino realized that the manual process of releasing a new version of the solution was impractical. “The manual release process took five or six of us two to three days to put together,” Dino explains. “The release process required an extremely high level of coordination and control and had a risk of human error. We couldn’t afford to jeopardize all the good work we’d done in the development phase—as well as the client’s respect for the solution—by releasing the software in this way.”
The team decided to automate the process of installing new releases of the BI solution by developing an application called bIntelligent Installation Manager, which the team wrote using Microsoft .NET Framework 2.0 and C#. All the objects for the different software components that comprise the BI solution are stored in the Microsoft Visual SourceSafe version-control system. When a new release is ready for distribution to the client, the application retrieves the objects from Visual SourceSafe and packages them into an installation file on a CD-ROM, which also includes installation instructions. The Moore Stephens team then sends the installation package to the client site, and the client simply configures and executes the installation package to deploy the release.
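The packaging step at the heart of bIntelligent Installation Manager—pulling the versioned objects together and bundling them with instructions into one installable archive—can be sketched roughly as follows. This is an illustrative Python stand-in, not the team’s actual C# tool; the directory layout, manifest format, and file names are invented for the example, and the real application retrieves its objects from Visual SourceSafe rather than a staging folder.

```python
import json
import zipfile
from pathlib import Path

def package_release(staging_dir: str, version: str, out_dir: str) -> Path:
    """Bundle every object file under staging_dir into one versioned
    installation archive, plus a manifest and install instructions.
    (Hypothetical layout; the real tool pulls from Visual SourceSafe.)"""
    staging = Path(staging_dir)
    files = sorted(p for p in staging.rglob("*") if p.is_file())
    manifest = {
        "version": version,
        "objects": [str(p.relative_to(staging)) for p in files],
    }
    archive = Path(out_dir) / f"bi_release_{version}.zip"
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for p in files:
            # Store each object under its path relative to the staging root.
            zf.write(p, arcname=str(p.relative_to(staging)))
        zf.writestr("manifest.json", json.dumps(manifest, indent=2))
        zf.writestr("INSTALL.txt", "1. Configure settings\n2. Run installer\n")
    return archive
```

Because every release goes through the same function, the archive contents are reproducible, which is the consistency benefit Dino describes.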
bIntelligent Installation Manager provides two key benefits, says Dino. First, “the effort required to actually release a version of a solution is cut down to half a day for one person as opposed to two or three days for four to six people.” But the best part, Dino says, is that the bIntelligent Installation Manager ensures “the consistency of a valid release, because we minimized the human-error factor. We prepare the release, package it, release it ourselves internally and test it, and make any changes, if needed. Once we sign off on the release form, it goes out to clients, so we know that whatever we pass on to the client is valid.”
These benefits were especially notable in the solution’s early versions, when Dino’s team needed to release weekly updates in response to the client’s numerous change requests. “For the first three or four months, it was a volatile change-management environment,” says Dino. “At the time, we had something like 150 to 200 different reports and views, and every week the client was changing the existing reports, adding new reports, or removing some of the reports that they decided they didn’t need.” Over time, the client settled on a stable set of requirements for the BI reporting it needed from the solution, and the number of releases has decreased drastically.
Dino says that Moore Stephens is currently upgrading the bIntelligent Installation Manager solution to enable it to take data from various insurance-industry sectors and provide a standard set of analytical and reporting views. “We actually sent out the first version of the [updated] Installation Manager a few weeks ago to a customer in Oklahoma,” says Dino. The latest version of the installation solution is also customizable. “If a client, say, needs only cubes and not the Reporting Services options, we can disable Reporting Services and just package the cubes,” says Dino. “The solution we delivered was designed, developed, and delivered successfully thanks to a great team effort. The main driver was a determination to produce something that will be reusable and that benefits our customers.”
Self-Service Development Testing
For Mike Metcalf, database administration team manager for a Delaware financial services company, keeping his databases secure and available is the highest priority. Mike and his four-DBA team manage more than 60 servers (including 45 production servers) running more than 270 SQL Server, Oracle, and IBM DB2 databases. Because of the sensitive nature of the data his company works with, his team doesn’t allow developers who work for the organization to run SQL Server jobs in any of the organization’s database environments—not even on development servers. This policy lets Mike and his team keep a tight rein on data access, but it can be frustrating for developers, especially when they need to run SQL Server jobs to test the applications they’re writing.
Mike explains that about half of the applications that his company uses are developed in house, so developers are constantly writing scripts and stored procedures that they need to test. “When the developers needed to run ad hoc jobs, they had to go through one of the DBAs. My team and I weren’t always at their beck and call to run a job, which sometimes led to frustration for the developers, plus time wasted waiting and frequent demands for elevated rights.”
To solve the problem, Mike created the dbaSelfService database, which contains permitted users, the validated jobs (with ID numbers) that they’re allowed to run, and the stored procedures that make the system work. When a developer wants to run a job, he logs in to the database and executes a stored procedure from Query Analyzer to see a list of the jobs that he’s allowed to run. The stored procedure matches the logged-in user’s ID against the list of permitted users, then matches that ID to the list of jobs the user is allowed to run.
When the developer chooses to run a job, the job name is put into a holding queue table. An sa-owned job runs every minute and queries the holding queue table for any waiting jobs. If the sa job finds any entries, it runs them. The user who executed the job gets an email notification when the job starts. The user can also see job information that’s written to a log table.
Because Mike created the tool strictly for use by his team, he trained developers to use it over time. When a developer would bring a job to Mike’s team to run, the DBA would show the developer the new tool and walk him through using it. The response from developers has been positive—so much so, that a couple of months ago, the team started using it for a production application. Mike explains, “One of our Web apps needed to be able to kick off a stored procedure that performs a user-initiated data load. The self-service database gave us the functionality to automate the data load.”
One of the keys to the solution’s success, says Mike, is its built-in security. “All our developers have to log in with Windows credentials, so people can log in and get access to run their own jobs but not anyone else’s jobs. To keep everyone honest, we also have an audit table to keep track of what jobs each person executes and to show that the jobs actually get executed.”
Mike enjoys his work and appreciates his team members, so he’s glad he could create a solution that makes their jobs easier. “I don’t necessarily consider myself a guru about all aspects of SQL Server, but I’m pretty handy. I try to solve things first and figure out what I need before I go looking for it. Occasionally I get lucky and find a better way to do whatever I’m trying to do.” (For more about the creativity that makes Mike an innovator, see the sidebar “Last Rites for a Beloved Server.”)
Taming the Scheduling Monster
Matt Mitchell describes himself as “network analyst by trade, and a DB developer by hobby.” But Matt’s enthusiasm for database programming has trickled down into his IT job with Alpena Regional Medical Center, where, he says, “for me every project uses a database because I love creating solutions with databases.” Matt, who has previous experience developing Web-based database solutions, jumped in to tackle the challenge of computerizing a paper-based system of scheduling logs that nurses and staffing employees at the medical center used for bidding on overtime shifts.
Until fairly recently, multiple copies of paper vacancy logs (lists of shifts that needed to be filled) were circulated around the nursing units. Nurses signed up for vacancies, and the completed logs went to the staffing office where staffing employees had to reconcile the various logs, then award overtime based on a complex set of labor-union rules. Matt sought to develop a system that made the bidding process fairer to the nurses by letting them access vacancy information from a central Web site and made it easier for staffing employees to fairly and accurately award the vacancies.
Matt’s solution consists of an ASP.NET Web application front end containing separate areas for the nurses and staffing personnel, a SQL Server 2000 database that contains vacancy information and the status of bids (awarded or not), and a simple reporting capability. “The ASP.NET Web application accesses SQL Server with ADO.NET using the System.Data.SqlClient namespace,” Matt explains. A short subprocedure in the Web application establishes a connection to the SQL Server database that contains the vacancy information and stores the connection as a SqlConnection object. Then a procedure creates a SqlDataAdapter object that uses one of the stored procedures or views included in the database as its SELECT statement. Finally, code in the application uses the SqlDataAdapter object to fill a DataSet object. “Since I’m not a full-time programmer, the nice thing about ADO.NET is that I don’t need to know a lot about the intricate details of accessing data from a SQL Server database. All I need to do is establish the connection, create a data adapter, and fill my dataset by using the data adapter. ADO.NET handles the rest,” says Matt.
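The connect, adapter, fill flow Matt describes is a general pattern, and the same three steps can be sketched outside .NET. The example below is an illustrative Python version with SQLite standing in for System.Data.SqlClient; the view name is invented, and Matt’s real code is ADO.NET against SQL Server 2000, not this.

```python
import sqlite3

def fill_dataset(database: str, select_view: str):
    # Open a connection, run the SELECT behind a view (or, in ADO.NET,
    # a stored procedure), and return the rows as a list of dicts --
    # the rough equivalent of filling a DataSet via a SqlDataAdapter.
    conn = sqlite3.connect(database)
    conn.row_factory = sqlite3.Row
    try:
        return [dict(r) for r in
                conn.execute(f"SELECT * FROM {select_view}")]
    finally:
        conn.close()
```

As in Matt’s description, the caller never touches the wire protocol: it names a data source and a query, and gets back an in-memory result set.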
The main Web page, which the nurses view, has a drop-down list for selecting the unit(s) to check for vacancies. Nurses log in to the Nursing Vacancy Log by using their Active Directory (AD) account information, then simply click a button to bid for a vacancy, and the system records the date and time of their bid. When staffing employees log in, they see a screen for setting up the schedules that will have vacancies available to the nurses, entering the vacancies, and approving the bids.
Interestingly, Matt first used Microsoft Access for the database, then switched to SQL Server. “When I begin implementing a SQL Server database design, I always start by creating an Access Project file and selecting the Project (New Data) option. Access is a nice tool for creating tables, establishing relationships, and creating the database diagram, and the familiarity of this program makes it easy for me to get started,” he says.
The third part of the solution, a reporting capability, was already available. “On our intranet server, we run SSW Access Reporter.NET for IIS, software that lets us serve up Access reports in PDF format to our intranet Web applications.” The Nursing Vacancy Log has only one report—a bid approval report. Matt designed the report in Access, then added an ASP.NET page that invokes the Access report.
Although Matt’s system hasn’t gone live yet (its release was delayed because of a job change in the staffing office), Matt has received positive feedback from the staffing coordinator about his solution, and the nursing staff have all been trained and have completed a practice exercise using the system. Even so, the benefits are already evident. “The time spent reconciling vacancy logs and awarding bids becomes negligible, and nurses no longer have to hunt down a vacancy log or leave the unit to visit the staffing office,” says Matt.
Comprehensive Server Monitoring
Although he’s a DBA, 18-year IT veteran Ayad Shammout has an all-inclusive perspective on Windows technology. “I don’t want to isolate myself only on the SQL Server database, because I know that SQL Server relies on Windows, Windows relies on networking, and so on. I’ve forced myself to understand other technologies, background processes, and dependencies, so if I encounter a problem, I can easily isolate or diagnose it,” says Ayad.
Ayad’s big-picture approach to technical problem-solving served him well in developing a solution that reports a plethora of system metrics. IT staff at CareGroup use such information to gauge the health of SQL Server 2005, SQL Server 2000, and Windows servers across the organization as well as for capacity planning and auditing.
Initially, Ayad developed his system to monitor the status of CareGroup’s approximately 400 SQL Server databases and 30 SQL Server instances. “I want to go to one place and access specific information about all my databases, rather than logging on to individual servers,” Ayad says. To obtain such data, IT used to retrieve data from SQL Server system objects, such as database data and log-file sizes on all SQL Server instances, then output that data into a Microsoft Excel spreadsheet. “It took IT a long time to [create the spreadsheet report], and if in a month or two, we needed to get an update, we had to do that work all over again.”
Ayad began building his centralized reporting system by creating a master repository database on a SQL Server instance. The repository would store data about every SQL Server instance, database, object, and user in CareGroup’s environment. Ayad created a linked server connection to all the SQL Server instances so that the stored procedures could pull data from them and put it in the repository database. Then he extended the system to collect Windows system information from all Windows servers via Windows Management Instrumentation (WMI) and Microsoft Systems Management Server (SMS). The data-collection jobs run nightly or weekly on the servers, depending on which data they’re gathering.
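The nightly collection step can be sketched as a loop over the monitored instances that appends each instance’s figures to the central repository. This is an illustrative Python model only, with SQLite connections standing in for linked servers, an invented `sysfiles` stand-in table, and invented repository column names; Ayad’s real system queries SQL Server system objects over linked servers and pulls Windows data via WMI and SMS.

```python
import sqlite3
from datetime import date

def collect_db_sizes(repository, instances):
    # Nightly sweep: for each monitored instance, read database name and
    # size (here from a stand-in 'sysfiles' table) and append the
    # figures, timestamped, to the central repository for trending.
    repository.execute("""CREATE TABLE IF NOT EXISTS db_size_history
        (collected TEXT, instance TEXT, db TEXT, size_mb REAL)""")
    today = date.today().isoformat()
    for instance_name, conn in instances.items():
        for db, size_mb in conn.execute("SELECT db, size_mb FROM sysfiles"):
            repository.execute(
                "INSERT INTO db_size_history VALUES (?, ?, ?, ?)",
                (today, instance_name, db, size_mb))
    return repository.execute(
        "SELECT COUNT(*) FROM db_size_history").fetchone()[0]
```

Keeping a dated history row per database, rather than overwriting a snapshot, is what makes the repository useful for the capacity planning Ayad mentions.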
After data is in the repository, IT can use Reporting Services to view the data. “We can get a complete overview of a specific user,” says Ayad. “For example, you can look up one user, and from that user drill down and see that the user has access to this particular server and database, has read-only permissions to one table, and has read-write permissions to another table.”
IT has come to depend so heavily on Ayad’s server-reporting system that he’s moving the database to a cluster server to ensure that it’s available 24 × 7, even during planned outages for maintenance. Ayad says that CareGroup’s IT has investigated third-party tools that perform similar system-reporting functions but found none as complete as his. For example, “a tool might report all the server properties but doesn’t work as comprehensively with the SQL Server databases. So we might end up running different [third-party tools] to get all this collected information. My solution saves money and the hassle of working with different tools. We own it and control it, so we can keep developing and adding.” In fact, Ayad is currently customizing the solution to collect data from Oracle databases on CareGroup’s UNIX servers and network data from Windows servers and network switches.
Automated Medical Coding
After developing an automated procedure-coding system for Baylor Health Care System, veteran DBA Ed Bond probably knows more than many healthcare professionals about invasive-cardiac– and peripheral-vascular–procedure codes. Traditionally, medical coders manually derive the correct codes for more than 300 distinct cardiac procedures from documentation entered in a patient’s chart. Missing information often prevents coders from entering the correct codes. The hospital asked Ed to assess whether automating the coding and enhancing electronic charting could eliminate such errors.
Ed first researched charge coding, which took several months, then developed algorithm specifications for all the codes and wrote the application. Data entered into a patient’s chart via the hospital’s computerized system is sent to the server via FTP. ParserQC, the Visual Basic (VB) utility that Ed wrote, monitors the FTP folder on the server and retrieves data as it appears. ParserQC parses the information necessary to derive codes from the charted notes and temporarily stores the data in a Microsoft Access database. The utility checks for required information, derives the charge codes from information stored in a SQL Server database, produces the patient reports for the medical record, and stores the derived data in the SQL Server database.
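The core loop—sweep the drop folder, parse each charted note, and derive codes only when the required information is present—can be sketched as follows. This is an illustrative Python model, not Ed’s VB utility: the keyword lookup, the charge-code values, and the folder layout are all invented for the example, and the real ParserQC derives its codes from data stored in SQL Server.

```python
from pathlib import Path

# Invented keyword-to-code lookup for illustration; the real mapping
# lives in Baylor's SQL Server database and covers 300-plus procedures.
CODE_LOOKUP = {"stent placement": "CHG-101", "diagnostic cath": "CHG-102"}

def derive_codes(chart_text: str):
    # Derive every charge code whose keyword appears in the charted
    # note; return None to flag an incomplete chart rather than guess.
    text = chart_text.lower()
    codes = [code for kw, code in sorted(CODE_LOOKUP.items()) if kw in text]
    return codes or None

def sweep_drop_folder(folder: str):
    # Poll the FTP drop folder the way ParserQC does and code each chart.
    return {f.name: derive_codes(f.read_text())
            for f in sorted(Path(folder).glob("*.txt"))}
```

Flagging charts that yield no codes, instead of silently skipping them, mirrors the system’s emphasis on catching the missing-information errors that used to cause miscoding.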
After using the new system for more than 18 months, Baylor Health Care System has greatly improved the accuracy of complex cardiac procedure coding. Doctors like the system because they no longer need to dictate procedure reports; ParserQC derives reports from the charted information. Lab staff spend less time manually deriving charge information and have less paperwork to process. “The system has decreased costs associated with charge coding, increased revenue [by reducing the number of rejected insurance claims] and accuracy of patient billing, and enabled clinical staff to spend more time on patient care,” says Ed.
David Stoltz, Information Management Programming Manager, Sacred Heart Hospital, Allentown, Pennsylvania
Sacred Heart Gets a Pacemaker
For several years, Sacred Heart Hospital in Allentown, Pennsylvania, had been having trouble with “bed flow,” the process of admitting patients, transferring them within the hospital, and eventually discharging them. The process was intensely manual, involving many phone calls between nursing staff and administrators looking for available rooms and beds, and bottlenecks plagued it further: if you couldn’t find the person who could answer your question, you were stuck. To solve the problem, the hospital asked David Stoltz to design a solution that the entire staff could access easily.
David explains, “We wanted to design a visual solution—a ‘bed board’—that people could access through a Web-based application. Although all the patient information we needed was already in our IBM AS/400, querying that system was far too slow.”
To speed queries, David’s team used SQL Server 2000 along with Microsoft Internet Information Services (IIS) 6.0 Web server and ASP technology to design an intermediary database for processing the necessary data. David created a DTS package, stored procedures, and tables to store the patient, room, and bed information in a SQL Server database. The DTS package connects to the AS/400, creates a transaction, then queries all the needed bed data and stores it in the database. This process runs once a minute, so the Web application is always current.
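One refresh cycle of that DTS package—query the source, replace the snapshot inside a single transaction—can be sketched like this. This is an illustrative Python model, with SQLite connections standing in for the AS/400 and the SQL Server intermediary database, and invented table and column names; David’s actual implementation is a DTS package with stored procedures.

```python
import sqlite3

def refresh_bed_snapshot(source, target):
    # One cycle of the per-minute refresh: query the current bed data
    # from the source system (the AS/400 in David's setup) and replace
    # the snapshot in the intermediary database inside one transaction,
    # so readers never see a half-loaded bed board.
    rows = source.execute(
        "SELECT room, bed, patient, status FROM beds").fetchall()
    with target:  # sqlite3 connection as context manager = one transaction
        target.execute("""CREATE TABLE IF NOT EXISTS bed_snapshot
            (room TEXT, bed TEXT, patient TEXT, status TEXT)""")
        target.execute("DELETE FROM bed_snapshot")
        target.executemany(
            "INSERT INTO bed_snapshot VALUES (?, ?, ?, ?)", rows)
    return len(rows)
```

Wrapping the delete-and-reload in one transaction is the detail that lets the refresh run every minute without the Web application ever reading a partially updated board.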
The SQL Server database ended up at just 3.63MB. The front-end Web page queries SQL Server instead of the AS/400, processing more than 60 stored procedures and code in less than one second—a dramatic improvement over direct queries to the AS/400, each of which took 45–50 seconds. This improvement means the new Web-based, visual front end gives hospital staff a bird’s-eye view of the entire hospital’s bed status, so nursing units can communicate with the admissions department—and vice versa—through the application.
The solution has been a great improvement for the hospital, says David. “Beds are cleaned faster, admissions can admit patients faster, it has improved unit communication and process flow, and there’s better accountability among the staff.” You can see a sample screen from the application at http://www.shh.org/images/bbs_ss.jpg.