PowerPivot for Excel SQL Server 2012 RC0 is Here

Microsoft just published the RC0 (release candidate) bits of PowerPivot for Excel in SQL Server 2012. Kudos to Microsoft for making the RC bits public this time. Stay tuned for an announcement of the arrival of the SQL Server 2012 RC bits soon. The RC build is feature-complete and stable, and will be the last pre-release build before SQL Server 2012 officially ships next year.

Here are all the SQL Server 2012 RC0 links published so far:

Microsoft SQL Server 2012 Express RC0

Microsoft SQL Server 2012 Semantic Language Statistics RC0

Microsoft SQL Server 2012 Master Data Service Add-in for Microsoft Excel 2010

Microsoft SQL Server 2012 Manageability Tool Kit RC0

Report Server Settings in SQL Server 2012 SharePoint Integration Mode

Well, the rules have changed. As Jaime Tarquino from the SSRS team explained in his blog post, “SQL Server 2012 Reporting Services SharePoint Integrated Mode”, there is no rsreportserver.config file anymore when you move to Reporting Services 2012 in SharePoint integrated mode (there are no configuration changes with native mode). Instead, the configuration sections are now saved in the SharePoint configuration database.

Problem: A customer plans to upgrade to SQL Server 2012 and SharePoint integration mode. They had the following custom renderer registered in the rsreportserver.config file that uses device information settings to customize the Excel renderer to suppress report headers and use Excel headers instead:

<Extension Name="EXCELNOHEADER" Type="Microsoft.ReportingServices.Rendering.ExcelRenderer.ExcelRenderer,Microsoft.ReportingServices.ExcelRendering">

    <OverrideNames>

        <Name Language="en-US">Excel (no Header)</Name>

    </OverrideNames>

    <Configuration>

        <DeviceInfo>

            <SimplePageHeaders>true</SimplePageHeaders>

            <RemoveSpace>0.012in</RemoveSpace>

        </DeviceInfo>

    </Configuration>

</Extension>

The customer claimed that this configuration is so important that there is no way they could upgrade if it doesn’t work in SQL Server 2012. And after assuring them that this was an easy fix, I found myself spending hours figuring out what needed to be done. On the positive side, I picked up a few PowerShell skills along the way, which may come in handy given that PowerShell is omnipresent nowadays.

Solution: In SQL Server 2012 (SharePoint integrated mode), you need to make such changes with PowerShell. The PowerShell Cmdlets topic in BOL is a good writeup about the SSRS cmdlets. The following script gets the job done:

Add-PSSnapin Microsoft.SharePoint.PowerShell

$apps = Get-SPRSServiceApplication

foreach ($app in $apps)

{

New-SPRSExtension -identity $app -ExtensionType "Render" -name "EXCELNOHEADER" -TypeName "Microsoft.ReportingServices.Rendering.ExcelOpenXmlRenderer.ExcelOpenXmlRenderer,Microsoft.ReportingServices.ExcelRendering" -ServerDirectives "<OverrideNames><Name Language='en-US'>Excel (no Header)</Name></OverrideNames>" -ExtensionConfiguration "<DeviceInfo><SimplePageHeaders>true</SimplePageHeaders><RemoveSpace>0.012in</RemoveSpace></DeviceInfo>"

}

The script starts by importing the Microsoft.SharePoint.PowerShell snap-in (you don’t need it if you run the remainder of the script in the SharePoint 2010 Management Shell). Then, the Get-SPRSServiceApplication cmdlet returns all the Reporting Services service applications available in the farm. For each application, I call New-SPRSExtension to register the new renderer. To run the script:

  1. Remote to the SharePoint server. You must have SharePoint Farm Administrator rights to run the script.
  2. Open SharePoint 2010 Management Shell from the Microsoft SharePoint 2010 Products program group. Or, if you have installed the Windows PowerShell ISE feature, you can open Windows PowerShell ISE from the Accessories program group, but make sure to run it as administrator (right-click it and click Run As Administrator).
  3. Copy and paste the script, and hit Enter to run it. The figure below shows the script in Windows PowerShell ISE:
  4. Do iisreset to apply the configuration changes.

For a full list of the New-SPRSExtension cmdlet parameters, you can execute the following command:

Get-Help New-SPRSExtension -Full

To verify that the changes are applied you could use one of the following two approaches:

  1. Execute the following script to output the configuration of all extensions:
    $apps = Get-SPRSServiceApplication

    foreach ($app in $apps) {

        $extensions = Get-SPRSExtension -identity $app

        foreach ($extension in $extensions) {

            echo $extension.ExtensionType $extension.Name

            echo $extension | select -ExpandProperty ConfigurationXml

        }

    }

2. In SSMS, connect to the SharePoint config database and run the following query. Examine the content of the Properties column to make sure that the custom extension is registered.

SELECT * FROM [dbo].[Objects] WHERE Properties LIKE '%excelnoheader%'

I was tempted to modify the Properties column in the Objects table directly and add the new extension section, but I realized I had better uses for my time than troubleshooting failed SharePoint installations, so I decided to stick with the supported way.

Special thanks to Prash Shirolkar from the Reporting Services team for shedding light in dark places. His blog has valuable insights about Reporting Services and SharePoint.

MDXParameter Sample

MDXParameter is a nice little utility that captures MDX queries sent from SSRS reports and replaces parameter placeholders with the actual values. I’ve been using Darren Gosbell’s excellent Query Capture sample, which is one of the first utilities I install when starting a new project. MDXParameter has a few more features that you might find appealing, including saving the queries in a database and showing the parameters separately. Installing MDXParameter is simple:

  1. Download MDXParameter from Codeplex and unzip it.
  2. MDXParameter requires a SQL Server database to capture the queries. It would have been nice to include an option to write the queries to a file or just to the screen, but currently you must create an empty SQL Server database. You can name the database anything you want.
  3. Double-click MDXParameter.exe to start it. You will be prompted to specify the connection details. Note that MDXParameter will automatically create the table schema.


  4. Click OK to launch the main interface and click Start to start the trace.
  5. Go to your SSRS report and run it. Make sure to delete the *.rdl.data file in the SSRS project folder to avoid running the report with cached datasets.
  6. Click Stop to stop the trace. You should see all the report MDX queries captured. Select a query to see its statement in the MDX tab. If the query has parameters, MDXParameter will replace them with the actual values (if the Replace Parameter checkbox is selected). This, of course, is the main reason you need MDXParameter, as it saves you a great deal of time when you want to execute a query with many parameters in SSMS. The Parameters tab shows you a grid with the actual parameter values. The Execute MDX button is currently not functioning.
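The parameter replacement that makes MDXParameter so handy boils down to substituting each @ParameterName placeholder in the captured MDX with its literal value. The sketch below illustrates the idea (a hypothetical simplification in Python; the function name and details are made up, not the utility's actual code):

```python
import re

def substitute_mdx_parameters(query, parameters):
    """Replace @Name placeholders in a captured MDX query with literal
    values, mimicking what MDXParameter does when Replace Parameter is
    checked. Hypothetical sketch, not the utility's actual code."""
    for name, value in parameters.items():
        # \b keeps @Category from also matching @CategoryGroup; a callable
        # replacement avoids re.sub interpreting backslashes in the value
        query = re.sub(r"@" + re.escape(name) + r"\b",
                       lambda m, v=value: '"%s"' % v, query)
    return query

captured = "SELECT ... FROM [Adventure Works] WHERE STRTOSET(@Category, CONSTRAINED)"
print(substitute_mdx_parameters(captured, {"Category": "[Product].[Category].&[1]"}))
```

The substituted query can then be pasted straight into SSMS, which is exactly the time saver described above.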


MDXParameter supports SSAS 2005, 2008, and 2008 R2.

Stopping SSRS Parameter Refresh

[View:https://prologika.com/CS/cfs-file.ashx/__key/CommunityServer.Blogs.Components.WeblogFiles/blog/5736.Product-Sales.rdl:550:0]

Problem: How do we stop the maddening parameter refresh for reports that use SSAS cubes? As you’ve probably found, nothing seems to stop it and the Never Refresh option on the Advanced parameter properties is useless.

Solution: Use the OLE DB provider for Analysis Services for the parameter queries, as shown in the attached report.

Business Intelligence Edition in SQL 2012

To expand on my announcement about SQL Server 2012 editions, I welcome the new BI edition for the following reasons:

  1. The BI edition will include all BI features, including features that previously required the Enterprise edition, such as SSAS partitioning and semi-additive functions, as well as SSRS data-driven subscriptions. Therefore, from a functionality perspective, there will be no difference between the BI and Enterprise editions as far as BI is concerned. That said, the Enterprise edition includes a set of database engine features not found in the BI edition, such as partitioning, ColumnStore indexes, AlwaysOn, etc.
  2. The BI edition could be a cost-effective alternative for ISVs and smaller deployments. For example, the Enterprise edition will cost you over $50K with a two-socket, 4-core server. However, the BI edition will cost you $8,592 + N x $209, where N is the number of users. So, for 20 users, it will be less than $13K. The cost convergence with this hardware configuration is at about 200 users. Above that number, Enterprise will be more cost-effective, unless of course you decide to upgrade to, say, a 4-socket server, in which case you need to upgrade your Enterprise license.
  3. Once you’ve purchased a CAL for a user, that user can access multiple licensed SQL servers. So, if you install SharePoint and Analysis Services in SharePoint integration mode on one server, which is licensed using the above formula, and then you install SSAS in Multidimensional or Tabular mode on a separate server, you need to pay only $8,592 for the second server. You don’t need to purchase new CALs. Per-user pricing allows you to spread the implementation across the appropriate number of machines without paying a significant price.
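The break-even math in point 2 is easy to verify. Here is a quick sketch (the $50K Enterprise figure is the approximate two-socket, 4-core price quoted above, not an official quote):

```python
BI_SERVER = 8592    # BI edition server license
CAL = 209           # per-user CAL
ENTERPRISE = 50000  # approximate Enterprise price for a two-socket, 4-core server

def bi_cost(users, servers=1):
    # CALs are per user and cover every licensed server, so adding a second
    # server adds only the server license, not more CALs (see point 3)
    return BI_SERVER * servers + CAL * users

print(bi_cost(20))                     # 12772: under $13K for 20 users
print((ENTERPRISE - BI_SERVER) / CAL)  # about 198: the roughly 200-user break-even
```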

SQL Server 2012 Licensing and Editions

Microsoft has spoken about SQL Server 2012 editions and licensing. We now have a brand new BI edition.

http://www.microsoft.com/sqlserver/en/us/future-editions/sql2012-editions.aspx

Handling Dates in BISM Tabular

[View:https://prologika.com/CS/cfs-file.ashx/__key/CommunityServer.Blogs.Components.WeblogFiles/blog/2308.AdventureWorks.zip:550:0]

If you have experience with multidimensional cubes, you know that most cubes have a Date dimension grained at a day level. If the fact table has multiple date keys, you can join the Date dimension multiple times to the measure group so the Date dimension can play multiple roles (a role-playing dimension), such as OrderDate, ShipDate, and DueDate. If you use Excel pivot reports connected to the cube, to get support for dates you need to make the following changes, as explained in Designing SQL Server 2005 Analysis Services Cubes for Excel 2007 PivotTables:

  1. Set the Type property of the Date dimension to Time. This is also required for MDX time-related functions, such as YTD.
  2. Set the ValueColumn column of the Date key to a column of date data type.

Once this is done, Excel “understands” the date semantics and treats dates as dates when you sort and filter on any field from the Date dimension. It also shows a nice menu for time calculations (relative dates).


But what if another dimension has a date attribute, such as the customer’s birth date? Unfortunately, Excel won’t recognize this field as being of a date type and will treat it as text. Not good, but these are the limitations of Excel OLAP pivot reports.

How does BISM Tabular change things as far as handling dates goes? Starting with SQL Server 2012 and PowerPivot version 2, you can mark a table as a Date table. In the process of doing so, you need to specify a unique identifier. Once this is done, you get the same support for dates with Excel pivot reports, because they are actually OLAP pivot reports that see BISM Tabular models as cubes. Specifically, Excel will treat only fields from the Date table as dates. Take for example the following model (attached to the blog):


The Orders table has three date fields (OrderDate, DueDate, and ShipDate). In SQL Server 2012, BISM Tabular and PowerPivot support multiple joins between tables, as Kasper de Jonge explains in his PowerPivot Denali: Working with multiple relationships between two tables blog post. Well, kind of, because only one join can be active (the one shown with a solid line). The other, dotted relationships can be used for DAX calculations, such as SalesAmountByDueDate, SalesAmountByShipDate, etc. In other words, BISM Tabular doesn’t support true role-playing tables, and you won’t automatically get DueDate and ShipDate instances of the Date table. You can import the Date table multiple times, but only one table can be marked as a Date table. However, Power View (code-named Crescent) is designed from the ground up to natively support BISM Tabular models. Consequently, the Excel date limitations disappear, as shown in the table below.

Date Feature       Excel                 Power View
Date types         Only in Date table    All tables
Relative dates     Only in Date table    N/A

Power View treats any date field as a date. For example, this Power View report shows the OrderDate and TaxAmt fields from the Orders table. The OrderDate field is of a date data type. As you can see, I can filter on dates after a given date and Power View shows the correct results. Unfortunately, Power View doesn’t support relative dates as Excel does, so you must define DAX time calculations.


In short, here are some recommendations for getting the most out of dates:

  1. If you need to support multiple dates, leave the columns of a date data type. Don’t use integer keys or smart keys, which are a common practice in data warehousing, to join to the Date table.
  2. Have a Date table that joins to the date that is most commonly used, so users can aggregate by quarters, years, etc. Make sure that the date columns in the other tables don’t include times, because you won’t be able to join 10/19/2011 8:40 PM to 10/19/2011. If you need the same support for the other dates, reimport the Date table multiple times.
  3. Consider creating inactive relationships for the other dates to be able to implement DAX calculations.
  4. Use Power View for reporting. Besides better handling of dates, Power View will give you better performance with transactional reporting, as I explained in my Transactional Reporting with BISM Tabular blog.
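The join problem behind recommendation 2 is easy to reproduce. A minimal sketch of why a datetime with a time portion won't match a day-grained Date table key:

```python
from datetime import datetime

# A fact-table value with a time portion vs. a day-grained Date table key
order_date = datetime(2011, 10, 19, 20, 40)  # 10/19/2011 8:40 PM
date_key = datetime(2011, 10, 19)            # 10/19/2011 (midnight)

print(order_date == date_key)                # False: the time portion breaks the match
print(order_date.date() == date_key.date())  # True once both sides are truncated to dates
```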

BTW, speaking of dates and BISM Tabular, Boyan Penev (MVP SQL Server) has implemented a nice DateStream feed on Azure Data Market. If you decide to consume the feed, you will notice that the DateKey column is returned as an ISO 8601 date (such as 2011-10-29T00:00:00). Unfortunately, Excel doesn’t recognize the ISO 8601 format. You might think that you can get around this predicament by creating a calculated column that removes the time portion, but you can’t use a calculated column as the unique identifier of the Date table. As a workaround, save the feed as a file and strip out the time portion so you can set the data type to Date.
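The "save the feed as a file and strip out the time portion" workaround can be scripted. A sketch, assuming the feed was saved as a CSV file with a DateKey column (adjust the column name and file format for your export):

```python
import csv

def strip_time_portion(in_path, out_path, column="DateKey"):
    """Rewrite a saved feed file so 2011-10-29T00:00:00 becomes 2011-10-29,
    letting the column be typed as Date on import."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            row[column] = row[column].split("T")[0]  # drop everything after the T
            writer.writerow(row)
```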

Reporting Services Data Alerts

SQL Server 2012 introduces a Reporting Services feature called Data Alerts. Data alerts notify a selected list of recipients about data changes on the report. Behind the scenes, data alerts use the report data feeds feature which was introduced in SQL Server 2008 R2. Consequently, similar to subscriptions, data alerts run in the background and poll the reports for changes. For more information about the data alerts architecture, read the Data Alerts (SSRS) topic in BOL.

The steps to create a data alert for the Company Sales 2008 report follow (note that creating alerts requires the View Items and Create Alerts SharePoint permissions):

  1. Open the report in SharePoint.
  2. Expand the Actions menu and click New Data Alert.


    Reporting Services opens the New Data Alert dialog box. The Report Data Name drop-down list shows all data regions defined on the report. In the case of the Company Sales report, there is only one data region – Tablix1. In addition, Reporting Services shows you the data that feeds the region based on the report parameters selected.


  3. In the right pane, click the Add Rule button to specify criteria for activating the alert. In my case, I want the alert to be triggered for the Accessories product category, Bike Racks product subcategory, year 2007, quarter Q4, and Sales greater than 70,000. As the report data shows, currently the Sales value for these criteria is 60,883.20. As new data is coming in, this value will probably increase and I want to be notified after it exceeds 70,000. I set the alert schedule to run on a daily basis (other options include Weekly, Hourly, or Minute).
  4. Next, you specify the e-mail addresses of the recipients who will get notified when the criteria are met and the e-mail subject text. Once you click Save, the alert is saved in the Reporting Services database and scheduled for execution with the SQL Server Agent. I can see and manage the alerts I’ve created by expanding the report context menu in SharePoint and clicking Manage Data Alerts.

    Reporting Services data alerts have the following limitations:

    1. They are only available in SharePoint. Reporting Services in native mode doesn’t support data alerts.
    2. They are supported with operational reports only, such as reports created by Report Designer or Report Builder and deployed to SharePoint. Data alerts cannot be created on Power View reports.
    3. Similar to subscriptions and report caching, data alerts require stored credentials. Consequently, developers targeting BISM Multidimensional or Tabular must go through the same hoops (none of them easy) to use alerts with Analysis Services. If you are not happy about this limitation, please vote for my suggestion on connect.microsoft.com that it is time for Microsoft to address it.

SQL Server 2012 data alerts extend the Reporting Services capabilities to poll reports and trigger notifications when report data changes. If you target data sources that require standard security (login and password), the feature is very easy to use and business users will probably love it.
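Conceptually, each scheduled alert run does what this sketch shows: re-read the report's data feed and fire when a row matching the rule's filters crosses the threshold. This is illustrative only; the real evaluation happens inside the SSRS alerting runtime:

```python
def alert_triggered(feed_rows, rule):
    """Return True when any feed row that matches the rule's filters
    exceeds the threshold, mimicking a data alert's scheduled check.
    Illustrative sketch, not the actual SSRS alerting logic."""
    for row in feed_rows:
        if all(row.get(k) == v for k, v in rule["filters"].items()):
            if row[rule["measure"]] > rule["threshold"]:
                return True
    return False

# The example rule from the walkthrough: Bike Racks sales above 70,000
rule = {"filters": {"Category": "Accessories", "Subcategory": "Bike Racks",
                    "Year": 2007, "Quarter": "Q4"},
        "measure": "Sales", "threshold": 70000}
feed = [{"Category": "Accessories", "Subcategory": "Bike Racks",
         "Year": 2007, "Quarter": "Q4", "Sales": 60883.20}]
print(alert_triggered(feed, rule))  # False: 60,883.20 hasn't crossed 70,000 yet
```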

Partition Generator for Multidimensional Cubes

Today happens to be my birthday. To reverse the tradition, how about I give you a gift? If you find yourself spending hours creating partitions in BIDS, you might find the Partition Generator sample (requires registration at the Prologika website) useful. I often partition larger cubes by month, which may require creating quite a few partitions. Not only is this time-consuming and tedious, but it is also error-prone, as you can easily get the partition slice query wrong. Written as a C# console application, Partition Generator has the following design goals:

  1. Keep it simple on purpose and due to time constraints. So, no GUI yet and you need to have basic .NET coding skills. You will need Visual Studio 2008 or later with C# to configure it.
  2. Auto-generate a partition definition file in an offline mode without making changes to the server by default. You can use this file to replace the *.partitions file in BIDS so the partitioning scheme can become a part of your project.
  3. Support an online mode where you can deploy the partition changes to the server.

Configuring Partition Generator

Configuring partition generator requires opening the source and making changes to reflect your environment. Specifically, you need to update the following variables:

static string _server = "<your server>"; // SSAS server where the cube is deployed

static string _database = "<database>"; // SSAS database

static string _cube = "<cube>"; // SSAS cube to partition

static string _measureGroup = "<measure group>"; // Measure group to partition

static string _dataSource = "<data source name>"; // The data source name

static DateTime _dateStart = new DateTime(2007, 1, 1); // start date for the period

static DateTime _dateEnd = new DateTime(2020, 12, 31); // end date for the period

// the partition query

static string _query = "SELECT * FROM [dbo].[FactInternetSales] WHERE OrderDateKey ";

Note that you need a live cube to connect to, which can be partitioned or not partitioned (default partition only). The start date and end date variables specify the start date for the first partition and the end date for the last partition, respectively. By default, Partition Generator will partition by month, but you can override this in code by changing the following line:

_dateStart = _dateStart.AddMonths(1); // assuming partition by month

Partition Generator also creates start and end partitions for dates outside the date range, which could be handy if you forget to add partitions after the end date has passed. Partition Generator also sets the partition slice, which is a recommended performance-optimization technique.
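To make the by-month logic concrete, here is a sketch of how monthly partition boundaries can be derived from the start and end dates. It is written in Python rather than the tool's C# and the names are illustrative; Partition Generator itself builds AMO Partition objects from equivalent boundaries:

```python
from datetime import date

def month_partitions(start, end):
    """Yield (name, first_day, next_first_day) for every month in [start, end].
    Each boundary pair becomes one partition's WHERE-clause date range."""
    current = date(start.year, start.month, 1)
    while current <= end:
        # first day of the following month (True counts as 1 when rolling the year)
        nxt = date(current.year + (current.month == 12), current.month % 12 + 1, 1)
        yield ("%04d%02d" % (current.year, current.month), current, nxt)
        current = nxt

parts = list(month_partitions(date(2007, 1, 1), date(2007, 12, 31)))
print(len(parts))                 # 12 monthly partitions for 2007
print(parts[0][0], parts[-1][0])  # 200701 200712
```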

Updating the Project

Once you configure Partition Generator, run it with Ctrl+F5. It uses AMO to retrieve the cube metadata and generates a partitions.xmla file in the project bin\debug folder.

  1. Double-click the partitions.xmla to open it in SQL Server Management Studio.
  2. Find and collapse the Partitions XML element. Select the Partitions element and press Ctrl+C to copy its content.


  3. Back up your SSAS project. Open your SSAS project in BIDS. In Solution Explorer, click Show All Files.
  4. Expand the Cubes folder, right-click the <cube name>.partitions file and click View Code to open its source.
  5. Find and select the entire Partitions element for the measure group you want to partition, and press Ctrl+V to replace the source with the new partition schema you copied in SSMS.
  6. Save the project and close BIDS. That’s because BIDS doesn’t handle well changes made directly to the source files.
  7. Open the SSAS project again in BIDS. Open your cube in the Cube Designer and click the Partitions tab. You should see the new partitions.

Deploying Partitions

By default, Partition Generator doesn’t change the deployed cube; it reads metadata only. However, it can be configured to deploy the partition changes. This could be useful when you want to experiment with different partitioning schemes, such as by month, quarter, year, etc. To configure Partition Generator to deploy the changes:

  1. In the PartitionGenerator project, right-click the PartitionGenerator project node in Solution Explorer and click Properties.
  2. In the Project Properties dialog box, select the Build tab and define a RELEASE conditional compilation symbol.

  3. Run Partition Generator to generate the partitions.xmla file and apply the changes to the SSAS server.

I haven’t tested Partition Generator with BISM Tabular. It should work, since Tabular uses the same Partition element in the model (*.bim) file. One more thing: Partition Generator is not supported, so please use it at your own risk.

Enjoy!

Book Review: MDX with Microsoft SQL Server 2008 R2 Analysis Services Cookbook

MDX is the query language for multidimensional cubes. Many BI practitioners perceive MDX as having a high learning curve, probably not so much for the language itself as for the multidimensional concepts you need to master before you can get something out of it. When interacting with the community, I am frequently asked to recommend an MDX book. So far, my recommendations have been Microsoft SQL Server 2008 MDX Step by Step by Bryan Smith et al. for beginners and MDX Solutions by George Spofford et al. for more advanced users.

I will add MDX with Microsoft SQL Server 2008 R2 Analysis Services Cookbook by Tomislav Piasevoli (MVP – SQL Server) to my recommendation list. This book takes an interesting approach that will be particularly appealing to readers who have already mastered the basics. As its name suggests, the book is a collection of recipes that you can quickly refer to when you tackle a specific MDX requirement. For example, suppose you need to obtain the last date with data in the cube. You flip to Chapter 2: Working with Time and find a section called Getting Values on the Last Date with Data. In it, the author provides the necessary background to explain the concept and a 9-step recipe for implementing it. Besides its practical and concise style, another thing I liked about this book is that it targets the Adventure Works cube, so you don’t have to spend time installing and learning other databases. You can hit the ground running by just copying and executing the queries.

All in all, this is a great book to add to your repertoire of Analysis Services books. The author is an industry-recognized expert with many years of experience developing BI solutions with Analysis Services, and this quickly becomes evident. Get the recipes!