A well-designed SQL Server 2000 Analysis Services data warehouse can yield significant information when you analyze the data warehouse cubes, and many excellent client tools are available to help you navigate OLAP cube data. Business intelligence (BI) tools such as ProClarity Analytics Platform, Microsoft Data Analyzer, and Microsoft Excel PivotTables offer visual drag-and-drop operations that let you organize dimensional data in various ways. But these tools hide the MDX queries that are behind the graphics. MDX is a dimensional query language similar to SQL but specialized for extracting data from multidimensional data sources. As you visually manipulate data in the analysis tool, the tool generates MDX to query the data source and provide results to fulfill your request.

Despite the convenience of these powerful tools, you sometimes need to use MDX to achieve a particular result. MDX can help you answer specific business questions such as, Which five regions returned the top performance last year, and who were the top customers in those regions? But for the new or occasional developer, MDX syntax can be challenging. Although MDX examples are plentiful online and in newsgroups and some good MDX books are available, MDX expressions can be difficult to learn and apply to a specific business environment, especially when you don't have examples that contain data similar to the data you're dealing with.
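To make that business question concrete, the first half of it might look something like the following sketch. This is illustrative only, written against the FoodMart 2000 Sales cube used later in this article; the dimension and measure names ([Store].[Store State], [Measures].[Unit Sales]) are stand-ins for whatever your own cube contains.

```
-- Illustrative sketch: top five store states by unit sales
SELECT
  {[Measures].[Unit Sales]} ON COLUMNS,
  TopCount([Store].[Store State].Members, 5,
           [Measures].[Unit Sales]) ON ROWS
FROM Sales
```

Extending the query to find the top customers within each of those regions adds another layer of set manipulation, which is exactly the kind of MDX that's hard to write without representative examples.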

One way to create representative MDX queries that work with your own cube's data is to log the MDX that your analysis tool generates. Then, you can examine the MDX to learn how its syntax works. Most client-side tools use PivotTable Service to connect to Analysis Services, and you enable logging through PivotTable Service, so you can use logging with a variety of analysis tools. Enabling logging is easy. As an example, let's look at how to turn on logging to access the MDX that Excel generates when it creates a PivotTable report.

Analysis tools communicate with multidimensional data sources by using the OLE DB client-side component PivotTable Service. For example, Excel PivotTables use PivotTable Service on the client-side computer to access Analysis Services cubes as a data source. The primary way that the client-side PivotTable Service connects to an Analysis Server is by using a connection string to specify which server to connect to and the connection attributes.
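A minimal PivotTable Service connection string looks something like the following sketch. The server name is a placeholder; Provider, Data Source, and Initial Catalog are standard MSOLAP connection properties.

```
Provider=MSOLAP; Data Source=MyServer; Initial Catalog=FoodMart 2000
```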

In Excel 2002 (the release I used in writing this article), you specify PivotTable data sources by using either the Import External Data menu item or the PivotTable and PivotChart Wizard. You can access both of these options through Excel's Data menu. With either method, you need to define the external data source. Using the Import External Data menu item generates an Office Data Connection (.odc) file; using the wizard generates an OLAP query file (.oqy).

Regardless of the method you use to specify the data source, Excel creates a connection string that connects the client to Analysis Services. You enable logging in this connection string by adding a Log File attribute that tells Excel to accumulate MDX queries while you work with the PivotTable. Adding the Log File attribute is a temporary modification. To turn logging off, you have to remove the Log File attribute.

How to Set Up Logging


If you've never connected to an Analysis Services data source before, you need to do so before you can get a connection string to modify. For this example, let's use the Import External Data method to create a connection. From Excel's Data menu, select Import External Data, Import Data. In the Select Data Source dialog box, select +Connect to New Data Source.odc or click New Source to create a new connection and start the Data Connection Wizard that Figure 1 shows. Select Microsoft SQL Server OLAP Services. In the next dialog box, select the server you want to connect to and click Next to display the Choose Data dialog box. Select the Analysis Services database and cube to analyze—in this case, the FoodMart 2000 cube—as Figure 2 shows, and save the resulting .odc file.

Before you can add the Log File attribute to the connection string, you first have to find the connection string. Excel provides a seamless interface to Analysis Services cubes, and the data-source file location isn't obvious. The default location for .odc files is C:\Documents and Settings\<your user name>\My Documents\My Data Sources, and the default location for .oqy files is C:\Documents and Settings\<your user name>\Application Data\Microsoft\Queries. For this example, you need to find the file named <server name> FoodMart 2000 Sales.odc.

After locating the data-source file, you need to edit the connection string and add the required Log File attribute. Note that although these data-source files are plaintext files, they're also machine readable, and various Excel components use them. Be careful when editing the connection string, and save a backup in case you make a mistake. If the data-source file becomes unusable, you'll need to recreate the data source from Excel.

You can either edit the data-source file in its original location or edit it while you're opening the data source to use in Excel. From Excel's Import External Data menu, select Import Data. The resulting list of data sources is the combination of all data sources, including the default locations I mentioned.

In the dialog box that presents the list of data sources, right-click the data source you want and select Open With (or a similar option) to open the file in a plaintext editor such as Notepad. Now you can examine and edit the file.

Let's look at how to edit each type of file; .odc files contain XML, whereas .oqy files contain name-value pairs. If you're working with an .odc file, you can find the connection string in the XML code at the <odc:ConnectionString> element. The connection string has multiple parts separated by semicolons, and each part is a property of the connection. To add the Log File property, insert a semicolon after the last property in the connection string (Auto Synch Period=10000) and add Log File="C:\myLOG.txt", as Figure 3 shows. (The quotation marks delimit the file location.) Excel will now log MDX activity to this local file.
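Stripped down to the relevant part, the edited element might look like the following sketch. The surrounding properties are abbreviated here; only the Log File property at the end is the addition.

```
<odc:ConnectionString>
  Provider=MSOLAP;Data Source=MyServer;Initial Catalog=FoodMart 2000;
  ...;Auto Synch Period=10000;Log File="C:\myLOG.txt"
</odc:ConnectionString>
```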

Figure 4 shows a typical .oqy file. In this case, the Connection= keyword signals the beginning of the connection string. The format of the connection-string properties is the same as the .odc variety, so you add the Log File property the same way you did for the .odc file.
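An edited .oqy file follows the same pattern. The following is an abbreviated sketch, not a complete file; your .oqy file will contain additional properties, and the server name is a placeholder.

```
QueryType=OLEDB
Version=1
CommandType=Cube
Connection=Provider=MSOLAP;Data Source=MyServer;Initial Catalog=FoodMart 2000;Log File="C:\myLOG.txt"
CommandText=Sales
```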

After logging begins, PivotTable Service logs MDX statements to the specified file after each PivotTable manipulation. Each entry has the same format: the application name in uppercase (in this case, EXCEL) comes first, followed by the process ID (PID) of the Excel instance that created the query, the query date and time, the query type (MDX), and the query text. Tabs separate these items.
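That tab-separated layout is easy to pull apart programmatically if you want to collect the queries from a log file. Here's a quick sketch in Python (purely for illustration; the sample entry below is fabricated to match the documented layout, not copied from a real log):

```python
# Parse a PivotTable Service log entry: tab-separated fields in the order
# application name, process ID, date/time, query type, query text.
def parse_log_entry(line):
    app, pid, timestamp, qtype, query = line.rstrip("\n").split("\t", 4)
    return {"app": app, "pid": int(pid), "timestamp": timestamp,
            "type": qtype, "query": query}

# A made-up entry in the documented format:
sample = "EXCEL\t1234\t2003-05-01 10:15:02\tMDX\tSELECT ... FROM Sales"
entry = parse_log_entry(sample)
```

Splitting with a maximum of four splits keeps any tabs inside the query text intact.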

Different Excel sessions that use the same connection string overwrite the log file, effectively clearing it and starting over. So when you collect some good MDX examples, always copy the log file's contents to another file to avoid losing the logged information. Also, only one connection at a time can own the log file. If you open another Excel instance or worksheet that uses the same data source, that connection's attempt to open the log file will fail; consequently, you can't start another PivotTable report on another Excel worksheet, for example. If the log file sits dormant when it should be receiving updates, check that the same data source isn't already in use in another Excel worksheet.

Examining the Entries


Let's look at some typical Log File entries that I collected when I dragged some dimensions onto the PivotTable in Excel. For a data source, I used the Sales cube from the FoodMart 2000 sample database that ships with Analysis Services. I opened the Sales cube and performed several PivotTable actions. I dragged the Gender dimension to the Row Fields area of the Excel PivotTable, then dragged Marital Status to the Column Fields area, and finally dragged Sales Average to the Data Items area. Figure 5 shows the results of the actions. Figure 6 shows the resulting log entry, which describes what happened behind the scenes.

Ignoring for the moment all the extra details (e.g., time, date) that Excel tacked on to the MDX statements, notice that the statements are records of what happened during each action and that each statement can stand on its own. You need to examine only the last generated statement—it contains everything in the previous statements and more. Additionally, the last generated statement would be the same regardless of the order in which I performed the drag operations. If I had dragged the Sales Average first, then Marital Status, then Gender, the last MDX statement that Excel generated would have been the same.

Looking at this last generated statement (which I highlighted in Figure 6), you can see that Excel added lots of extra information to the MDX statement. Let's take this statement apart from the inside out. First, the DrillDownLevel() function automatically takes the set specified as a parameter and drills down to reveal that level's members. For example, in the string DrillDownLevel({[Gender].[All Gender]}), DrillDownLevel() drills down from the All Gender level to the next level in the hierarchy of the Gender set and provides a set of members from that level—M and F. When you drag a dimension onto the Excel PivotTable, you don't see what happens to make those members appear. The MDX in this scenario is the same MDX that Excel uses to drill down when you double-click a dimension member in the PivotTable. But in that case, the DrillDownLevel() string that Excel generates includes an additional level parameter to show that DrillDownLevel() should return members of the specified level instead of just the next level, which is the default. You can see more DrillDownLevel() examples by experimenting with Sales cube dimensions—such as the Store dimension—that have additional levels.
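Stripped of Excel's extras, the core of the drill-down reduces to something like the following simplified sketch against the FoodMart 2000 Sales cube (Sales Average is a calculated measure that ships with that cube):

```
SELECT
  {[Measures].[Sales Average]} ON COLUMNS,
  DrillDownLevel({[Gender].[All Gender]}) ON ROWS
FROM Sales
```

The rows set drills from All Gender down one level, returning the All Gender member plus M and F.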

While you're examining the generated MDX, copy it to the clipboard, then paste it into the MDX Sample Application that ships with Analysis Services. The MDX Sample Application is a great place to experiment with and see the results of various MDX statements. Figure 7 shows the result of my example statement. I removed most of the trimmings except for the DrillDownLevel() functions.

The next function in the generated MDX, AddCalculatedMembers(), includes in the set any calculated members of the dimension. The use of this function is interesting because a metadata-based function, such as DrillDownLevel() in this case, doesn't typically include calculated members. Perhaps the Excel designers included the call to AddCalculatedMembers() as a safety measure so that these members would never be excluded from the PivotTable output and you can have confidence that you're seeing complete results in the output.
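Wrapping the previous sketch in AddCalculatedMembers() gives a shape like the following. Note that the Gender dimension has no calculated members in FoodMart 2000, so here the function simply passes the set through unchanged; its effect shows up only on dimensions that define calculated members.

```
SELECT
  {[Measures].[Sales Average]} ON COLUMNS,
  AddCalculatedMembers(
    DrillDownLevel({[Gender].[All Gender]})) ON ROWS
FROM Sales
```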

The next layer in the generated MDX is the call to Hierarchize(), an ordering function that, in this case, ensures that the members appear in the correct order, according to their hierarchy (e.g., children follow their parents in the hierarchy). This hierarchy is easy to see in the PivotTable but can be difficult to see in the MDX Sample Application.
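Putting the three layers together gives the overall shape of Excel's generated set expression, again as a simplified sketch with the extra decorations removed:

```
SELECT
  {[Measures].[Sales Average]} ON COLUMNS,
  Hierarchize(
    AddCalculatedMembers(
      DrillDownLevel({[Gender].[All Gender]}))) ON ROWS
FROM Sales
```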

Another interesting detail is the liberal use of PARENT_UNIQUE_NAME, which is an intrinsic member property available for a drilled-down dimension and which contains the name of the current level's parent. Excel PivotTable uses this property to correctly label parent levels in the PivotTable—for example, Store Country is a parent level for Store State. If the property is present in the query result, Excel will use it for this purpose.
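You can request the property yourself with the DIMENSION PROPERTIES clause, as in the following sketch against the same FoodMart 2000 Sales cube:

```
SELECT
  {[Measures].[Unit Sales]} ON COLUMNS,
  [Store].[Store State].Members
    DIMENSION PROPERTIES PARENT_UNIQUE_NAME ON ROWS
FROM Sales
```

Each row member in the result carries its parent's unique name (a Store Country member) alongside the cell data.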

Better Skills Produce Better BI


That's all there is to getting Excel to reveal the MDX queries it's using during PivotTable operations. Experimenting with the logged results of different Excel PivotTable operations can be an inexpensive and instructive way to build your MDX knowledge. Combining a good grasp of your business data with the expertise to manipulate cube data using MDX in ways that go beyond what the tools offer can give you a leg up when it comes to producing better BI information from the data you already have.