In Part 1 of the series, I set up a SQL Azure data mart and used SSIS to extract, transform, and load (ETL) data from on-premises SQL Server 2008 R2 to a SQL Azure database. Azure let me move my data into the cloud using the same tools I use today for traditional SQL Server, and it required no schema changes. In Part 2, I will begin performing some ad-hoc analysis on that Azure cloud database with PowerPivot.
At this point in the project, as I perform ad-hoc self-service business analysis against cloud data in my SQL Azure database, I should point out that I see this as a hybrid approach. As it stands today, I cannot create a PowerPivot cube or an SSAS cube in the cloud. But I can use Excel 2010 with the PowerPivot add-in to run powerful analysis on data that lives in the cloud, and do so without any local on-premises infrastructure: no local data mart, data warehouse, or database of any kind. I am going to use a direct connection to SQL Azure from PowerPivot for Excel 2010.
Let’s take a quick look at the AdventureWorksDW2008R2 tables from SQL Azure that we’re going to use. You can view them from SSMS by connecting to Azure:
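If you would rather script the check than browse in SSMS, you can list the tables by querying INFORMATION_SCHEMA over an ODBC connection. Here is a minimal Python sketch; the server name, login, and password are placeholders you would replace with your own, and note that SQL Azure expects the login in the user@server form and requires encryption:

```python
def azure_conn_str(server, database, user, password):
    """Build an ODBC connection string in the form SQL Azure expects.

    All argument values are placeholders -- substitute your own
    SQL Azure server, database, and credentials.
    """
    return (
        "Driver={SQL Server Native Client 10.0};"
        f"Server=tcp:{server},1433;"
        f"Database={database};"
        f"Uid={user};"          # SQL Azure logins use the user@server form
        f"Pwd={password};"
        "Encrypt=yes;"          # SQL Azure requires encrypted connections
    )

if __name__ == "__main__":
    # pyodbc is only needed when actually connecting, so it is
    # imported here rather than at module level.
    import pyodbc

    conn_str = azure_conn_str(
        server="yourserver.database.windows.net",   # placeholder
        database="AdventureWorksDW2008R2",
        user="your_login@yourserver",               # placeholder
        password="your_password",                   # placeholder
    )

    # List the user tables we will later import into PowerPivot.
    query = (
        "SELECT TABLE_SCHEMA, TABLE_NAME "
        "FROM INFORMATION_SCHEMA.TABLES "
        "WHERE TABLE_TYPE = 'BASE TABLE' "
        "ORDER BY TABLE_NAME"
    )

    with pyodbc.connect(conn_str) as conn:
        for schema, name in conn.cursor().execute(query):
            print(f"{schema}.{name}")
```

Run against the data mart, this prints the fact and dimension tables we will use in the model.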
Or you can connect using a developer-friendly schema viewer and management tool built specifically for SQL Azure:
To build a PowerPivot solution from SQL Azure, launch Excel 2010 and open the PowerPivot add-in. From the PowerPivot window, select “From Other Sources” and then choose “Microsoft SQL Azure” in the data source connection wizard:
After I’ve imported the data from the SQL Azure cloud database into my local PowerPivot in-memory analysis cube, I will create pivot tables and end up with a sales report from the data mart history that lets me drill down into years, products, and reseller and Internet sales:
To create the model that makes that analysis possible, I’ve brought in these fact tables: FactInternetSales and FactResellerSales, and these dimension tables: DimDate, DimProduct, DimProductSubcategory, and DimProductCategory:
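Conceptually, the pivot analysis over this star schema boils down to joining each fact row to its dimensions and aggregating sales by the attributes you drill into. As a rough illustration of that relational logic (this is not PowerPivot’s actual engine, and the sample rows below are made up), here is a tiny in-memory sketch:

```python
from collections import defaultdict

# Hypothetical miniature versions of the dimension tables,
# keyed the way the real DW keys them (DateKey, ProductKey).
dim_date = {20080101: {"year": 2008}, 20090101: {"year": 2009}}
dim_product = {
    1: {"product": "Road Bike", "category": "Bikes"},
    2: {"product": "Helmet", "category": "Accessories"},
}

# Hypothetical fact rows: (DateKey, ProductKey, SalesAmount)
fact_internet_sales = [(20080101, 1, 2300.0), (20090101, 2, 35.0)]
fact_reseller_sales = [(20080101, 1, 1800.0), (20090101, 1, 2100.0)]

# Aggregate sales by (year, category, channel) -- the same grouping
# a pivot-table drill-down performs over the star schema.
totals = defaultdict(float)
for channel, rows in (("Internet", fact_internet_sales),
                      ("Reseller", fact_reseller_sales)):
    for date_key, product_key, amount in rows:
        year = dim_date[date_key]["year"]
        category = dim_product[product_key]["category"]
        totals[(year, category, channel)] += amount

for key in sorted(totals):
    print(key, totals[key])
```

PowerPivot does this kind of join-and-aggregate for you in memory, driven by the relationships between the fact and dimension tables.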
If you enlarge the screenshot above, you’ll see that I gave user-friendly names to the tables and some of the fields. You can do this in PowerPivot when you import the data, or you can modify the names right in the PowerPivot workbook window. I left PowerPivot’s default auto-detection of relationships between the tables in place, and it worked just fine in this case.
Now you can have your Excel users performing complex business analysis and building self-service BI dashboards with PowerPivot for Excel against SQL Azure, with all of the data in the cloud data mart. In Part 3, I’ll continue this series using the new CTP (beta) of SQL Azure Reporting Services, where I’ll show you how to create scorecards and reports for Microsoft Cloud BI without needing Reporting Services infrastructure on-premises.
UPDATE: If you want to replicate this on your own SQL Azure database, you can take the entire free AdventureWorksDW2008R2 sample database for SQL Server 2008 R2 and migrate it to your SQL Azure instance using the free SQL Azure Data Migration Wizard. If you do not yet have a SQL Azure account, MSDN subscribers have access to Azure accounts, and the AdventureWorks sample databases fit easily within the smallest SQL Azure database size limit of 1 GB.