Posts

SQL Server 2008 R2 November CTP (CTP3)

Microsoft announced the release of SQL Server 2008 R2 November CTP (CTP3) with build number 1352.12. MSDN and TechNet subscribers and TAP customers can download the SQL Server 2008 R2 November CTP today. Non-subscribers will be able to access the CTP on Nov. 11th. This should be a feature-complete build that includes all SQL Server 2008 R2 functionality.

Reporting Services

The new Reporting Services features include:

SharePoint Integration

SharePoint Integration has several new features that include the following:

  • Support for multiple SharePoint Zones.
  • Support for the SharePoint Unified Logging Service.
  • A new data extension.
  • A query designer for SharePoint Lists as a data source.
  • Support for right-to-left text with Arabic and Hebrew in the SharePoint user interface.
  • In a report, the data extension supports including data from SharePoint lists for SharePoint Foundation 2010, SharePoint Server 2010, Windows SharePoint Services 3.0, and Office SharePoint Server 2007.
  • Report Parts.

Shared Datasets and Cache Refresh Plans

Shared datasets are a new type of report server item that can retrieve data from shared data sources that connect to external data sources.

Cache refresh plans let you cache reports or shared dataset query results on first use or from a schedule.

New Data Visualization Report Items

The November CTP introduces several new report items that depict data:

  • Sparklines and data bars are simple charts that convey a lot of information in a little space. These are often used in tables and matrices. Their impact comes from viewing many of them together and being able to quickly compare them, instead of viewing them singly.
  • Indicators are minimal gauges that convey the state of a single data value at a glance.

Enhancements to Reporting Services Features

The November CTP provides the following enhancements to Reporting Services features that were released in earlier SQL Server 2008 R2 CTPs or SQL Server 2008:

  • Configuring a Map Layer—The Map wizard displays data from both the spatial data source and the analytical data source to help you choose the field on which to match.
  • Calculating Aggregates of Aggregates—You can create expressions that calculate an aggregate of an aggregate.
  • Enhanced Support for Expressions—New globals and a new property for report variables provide support for overall page numbering, naming pages, and information that is specified in the renderer.
  • Rotating Text 270 Degrees—Text boxes can now be rotated 270 degrees.
  • Report Pagination—Page breaks on tablix data regions (table, matrix, and list), groups, and rectangles give you better control of report pagination.
  • Naming Excel worksheet tabs—You can set properties on the report to name the worksheet tabs when you export the report to Excel.

Business Intelligence Development Studio Support for SQL Server 2008 Reports and Report Server Projects

The SQL Server 2008 R2 version of Business Intelligence Development Studio supports working with both SQL Server 2008 and SQL Server 2008 R2 reports and Report Server projects.

New Web Service Endpoint

The report server includes a new management endpoint named ReportService2010.ReportingService2010. This endpoint merges the functionality of the ReportService2005.ReportingService2005 and ReportService2006.ReportingService2006 endpoints and supports management operations of the report server in both native mode and SharePoint integrated mode. It also includes new features introduced in SQL Server 2008 R2, such as shared datasets and cache refresh.

Analysis Services

The new Analysis Services features are centered around Gemini and include:

PowerPivot for Excel

PowerPivot Field List Enhancements

Numerous enhancements have been added to the PowerPivot field list. These include the following:

  • Automatic detection of relationships when columns from unrelated tables are used in a PivotTable.
  • Support for searching fields by name.
  • Support for named sets.

PowerPivot Data Source Editing

Support is available for the following data source editing tasks:

  • Refreshing data for a single table, for all tables from a data source, or for all the tables in the PowerPivot window.
  • Editing data source information such as server name, database name, and credentials.
  • Editing table settings such as data source table or query used, columns imported, and filters.

Support for More Types of Data Sources

Targeted support has been introduced for importing data from the following sources:

  • Microsoft Access 2003, 2007, 2010
  • Microsoft SQL Server 2005, 2008, 2008 R2 Analysis Services
  • PowerPivot workbooks published to Analysis Services running in SharePoint Integrated Mode
  • Text files
  • Excel 97-2003, 2007, 2010
  • Microsoft SQL Azure
  • Oracle 9i, 10g, 11g
  • Teradata V2R6, V12
  • Sybase
  • Informix
  • IBM DB2 relational databases 8.1

Enhanced Data Analysis eXpressions (DAX)

DAX functionality has been significantly expanded in the following ways:

  • User interface enhancements include better propagation of errors and easier creation of DAX formulas.
  • Several functions have been added, including a set of functions that operate on textual data, as well as those that provide common Date and Time manipulations.
  • Several performance and functional capabilities have been introduced, including automatic recalculation.
  • Support for handling referential integrity violations between related tables has been introduced.
  • Automatic selection of data types for expressions is available.
For more information, see the topic “Data Analysis Expression Language Reference” in SQL Server 2008 R2 Books Online.

DAX is also documented in the online Help of the PowerPivot for Excel add-in. For more information, see the Help file that is installed with the add-in.

PowerPivot for SharePoint

Management Dashboard

A management dashboard for IT has been introduced. This dashboard provides visibility into PowerPivot use on a SharePoint farm, including information about the following areas, as well as a rich PowerPivot workbook that can be used to build custom reports:

  • Published PowerPivot workbooks, including the number of queries executed, the number of users, and the size of each workbook.
  • Hardware resource utilization for the PowerPivot service, including CPU and memory, reported to the dashboard on a daily basis.
  • Data refresh activity for all PowerPivot workbooks that are accessed through the same PowerPivot service application.

For more information about this feature, see the topic “PowerPivot Management Dashboard” in SQL Server 2008 R2 Books Online.

Enhanced Infrastructure

The following enhancements to infrastructure are available:

  • Claims-aware support to enable a user’s identity to flow between SharePoint components in a secure manner.
  • Parallelized refresh of PowerPivot data.
  • Performance optimizations.

The Aggregate Function

Reporting Services supports an Aggregate function to retrieve aggregated values directly from the data provider if the provider supports this feature. The Aggregate function is especially useful for retrieving data from Analysis Services because it may be difficult, if not impossible, to recreate aggregated values on the report, such as when the cube script overwrites the server aggregates.

It’s easy to use the Aggregate function: just replace the default Sum() aggregate function with Aggregate() in the group subtotals. When you do this, the Analysis Services provider brings in additional rows that include the server aggregates, which the report displays in the group subtotals.
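For example, a group subtotal expression changes like this (the SalesAmount field name is illustrative):

```vb
' Default group subtotal computed by the report engine
=Sum(Fields!SalesAmount.Value)

' Retrieves the server-side aggregate from the data provider instead
=Aggregate(Fields!SalesAmount.Value)
```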

Recently, I came across a report that would return the following exception after attempting to use Aggregate().

The Value expression for the text box '<textbox>' uses Aggregate with an invalid scope parameter.

I couldn’t find anything wrong with the way Aggregate() was used because it was in the right place on a crosstab report. Robert Bruckner from the Reporting Services team helped me understand the Aggregate() function’s limitations:

The Aggregate function can only be used if the current scope and all parent group scopes are all based on simple field references (e.g. =Fields!Accounts.Value), or constants. The Aggregate function also cannot be used in detail rows.

In my case, the report was using a parent-child Analysis Services hierarchy. To show the data recursively, the tablix Details group was set to group on =Fields!Account.UniqueName. This is an extended property that the SSAS provider supports, and it’s not a simple field reference. Changing the grouping to =Fields!Account.Value resolved the issue.

Why Can’t We Connect Anymore?

A new OS (Windows 7 in this case) and new issues when trying to connect to Analysis Services on a remote server.

I installed Windows 7 and added it to a corporate domain. I could use SSMS to connect to some SSAS servers, but there was one server (the most important, of course) that refused connections with the following dreadful message, which I am sure you’ve seen somewhere along the line:

 

TITLE: Connect to Server

——————————

Cannot connect to <servername>.

——————————

ADDITIONAL INFORMATION:

The connection either timed out or was lost. (Microsoft.AnalysisServices.AdomdClient)

——————————

Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host. (System)

——————————

An existing connection was forcibly closed by the remote host (System)

 

I was able to connect to that server from other non-Windows 7 machines, so I narrowed the issue down to Windows 7. Interestingly, running SQL Server Profiler on the server showed that the right Windows identity was passed, so the issue wasn’t with the server itself either. Even more interesting, I had no problems connecting to the SQL Server Database Engine on that server using Windows Authentication. Attempting to connect to the server with SQL Server Profiler from the Windows 7 machine, or attempting to browse the cube in BIDS, would show another enlightening message:

“The security database on the server does not have a computer account for this workstation trust relationship”

Binging the internet showed that Kerberos doesn’t agree with Windows Server 2008 on the server and Windows Vista (or Windows 7) on the client. But in my case, the server was running Windows Server 2003 with SP2. More binging and a Eureka moment after reading KB 976419, which luckily had just been released.

Windows Live ID Sign-in Assistant 6.5 installs a new Security Support Provider (SSP) package. The SSP package causes the Security Support Provider Interface (SSPI) that is used by the Analysis Services OLE DB provider not to switch to NTLM authentication.

You gotta be kidding me! So, the unassuming Live ID Assistant, which every Live application out there recommends, nukes NTLM connectivity to Analysis Services? In case you are interested, since a hotfix wasn’t available yet, I went with method 2, which unregisters the livessp provider, and the problem went away.

I’ve got to say that for the past two years I have had my fair share of connectivity issues with Analysis Services. It looks to me like the Analysis Services team has to go back to the OLE DB provider drawing board and solve these issues. If I can connect to the Database Engine, I should be able to connect to Analysis Services.

Applied Analysis Services 2008 Online Training Class with Teo Lachev

We are excited to announce the availability of an online Analysis Services 2008 class – no travel, no hotel expenses, just 100% content delivered right to your desktop! This intensive 3-day online class (14 training hours) is designed to help you become proficient with Analysis Services 2008 and acquire the necessary skills to implement OLAP and data mining solutions. Learn how to build an Adventure Works cube from scratch. Use the opportunity to ask questions and study best practices.


Applied Analysis Services 2008 Online Training Class
with Teo Lachev

Date: November 16 – November 18, 2009
Time: Noon – 4:30 pm EST; 9 am – 1:30 pm PST
14 training hours total for only $799!


 Attend this class for only $799 and get a free unrestricted e-book copy of the book Applied Analysis Services 2005 by Teo Lachev!

For more information or to register click here!

Intelligencia for Silverlight

After retiring Office Web Components (OWC), which can hardly pass the web test as it is implemented as an ActiveX control, Microsoft left developers with no web-based browser for Analysis Services. True, for those who can afford the hefty price tag, Microsoft Office SharePoint Server (MOSS) supports server-side Excel spreadsheets that render to HTML. However, while Excel rules the Windows-based OLAP browser space, HTML-based Excel spreadsheets with limited filtering can hardly meet the demand for web-based interactive reporting. Then, there is the option to author Reporting Services OLAP reports, but outside of limited interactive features, their layout is pretty much fixed at design time.

What’s really needed is a Silverlight Analysis Services control that ships with Visual Studio to let developers embed an Excel-like Analysis Services browser into their ASP.NET applications. You need this control and I need it, but it’s not coming anytime soon. Meanwhile, third-party vendors rush in to fill the gap.

In a recent project, we used the Dundas OLAP Chart for Analysis Services, which I mentioned in one of my previous blogs. Dundas has just released version 7 of this control and I really like what I see. It’s currently the best OLAP-based chart in the industry. The Dundas OLAP Chart for Analysis Services is a regular ASP.NET control with AJAX features for an improved user experience. With Silverlight establishing itself as the platform of choice for web developers, you may be looking for a Silverlight-based Analysis Services browser. This is where Intelligencia for Silverlight comes in.


I blogged about Intelligencia for Reporting Services before and pointed out where I think it surpasses the MDX Query Designer included in Reporting Services. What IT-Workplace has done this time is bring their product to the web, and it has wisely decided to use Silverlight. The moment I saw the new version, it grabbed my attention. Users familiar with Excel PivotTable reports will find the Intelligencia metadata pane (Field List) very similar. Users can create a report by dragging and dropping objects to the report pane on the right. Actually, I would love to see IT-Workplace adding the Filter, Columns, Rows, and Values panes below it, just like Excel. I’d also welcome in-place filtering, just like Excel. You get the idea. For me and my users, the more Excel-like the browser is, the better. This lowers the learning curve and unifies the desktop and web experience.

But Intelligencia for Silverlight is more ambitious than just bringing Excel-like reporting to the web. The control has a scriptable interface and a filter control, which allows management dashboards to be created by linking grids and filters, as the first link on the web page demonstrates.

 

In summary, while still rough around the edges (it’s currently in a beta phase), I think Intelligencia for Silverlight has a lot of potential and is positioned to fill a wide gap left by Microsoft by letting web developers embed an Analysis Services OLAP browser in their applications. Powered by Silverlight, Intelligencia for Silverlight could bring Excel PivotTables to the web. Visit the Intelligencia for Silverlight web page, download and test the control, and provide feedback to improve it.

Aggregating Many-to-Many Relationships

Relax…this blog is all about Analysis Services and not about polygamy or something worse.

SSAS 2005 went beyond the “classic” OLAP model by introducing flexible dimension-to-measure group relationships, including many-to-many (M2M), referenced, and fact relationships. M2M relationships can solve some nagging business problems. For example, in a recent project where we had to implement a financial cube, there was a requirement that a G/L account can belong to multiple account groups. This is a classic M2M scenario that can be elegantly solved by introducing an M2M relationship. Adventurizing this, let’s take a look at this schema.


Here, I changed the Adventure Works schema by adding DimGroup and DimGroupEmployee tables. The DimGroup dimension is a parent-child dimension that lets the end user slice data by territory. An employee can be associated with one or more territories (think of a sales manager that covers multiple territories). This M2M relationship (groups-territories) is resolved via the DimGroupEmployee table. In the cube, DimGroup is configured to join the Reseller Sales measure group via a Group Employee factless measure group that is bound to the DimGroupEmployee table. This lets us browse the reseller measures by group.
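For reference, the bridge table behind this relationship can be a simple two-column table; a sketch follows (the column names are illustrative and may differ from the actual schema):

```sql
-- Bridge table resolving the M2M relationship between groups and employees
CREATE TABLE DimGroupEmployee (
    GroupKey    int NOT NULL REFERENCES DimGroup (GroupKey),
    EmployeeKey int NOT NULL REFERENCES DimEmployee (EmployeeKey),
    PRIMARY KEY (GroupKey, EmployeeKey)
);
```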


So, everything is cool, right? Yes, at least until I realized that the aggregated totals needed to be adjusted for some account groups. To demonstrate this, suppose that you need a scope assignment that overwrites Georgia’s total:

([Group].[Groups].[Georgia], [Measures].[Reseller Sales Amount]) = 10;


Do you see the problem? The Group1 total didn’t budge. However, if you do the same on a parent-child dimension that joins the measure group via a regular relationship, the total would change. Microsoft declared this M2M behavior by design. When you come to think about it, it makes sense. To aggregate the measures, the server must go via the bridge table (DimGroupEmployee) to find the qualifying members (DimEmployee) that belong to the member on the one side of the relationship. So, when you make an assignment, you are essentially overwriting this behavior.

If a calculation has a slice on an M2M dimension, it applies to a cell only if the cell coordinate matches the M2M slice. When a cell is at a higher level for the M2M attribute, the calculation does not apply and will not be rolled up to the parent level. Intuitively, an M2M attribute is non-aggregatable. In this case, when calculating the grand total, the SSAS calculation engine will not sum up the subtotals from each group. However, if the assignment is made on the many side (DimEmployee in our case), then aggregation on DimGroup works as expected.

Now, of course, as a modeler you would hope for a magic switch to let you roll up the assigned values and align the M2M behavior with regular relationships, but for now we have to deal with the “by design” behavior. Microsoft suggested a workaround: introduce a scope assignment that forces the M2M members to aggregate, something like this:

Scope ([Group].[Groups].Members);
    this = IIF (IsAncestor([Group].[Groups].CurrentMember, [Group].[Groups].[Georgia]),
                Sum([Group].[Groups].CurrentMember.Children),
                ([Group].[Groups].CurrentMember, [Measures].[Reseller Sales Amount]));
End Scope;

This works, but it may introduce a performance penalty with larger dimensions. Instead, in our case, we decided not to use an assignment at all. We introduced a new helper measure to the measure group (FactResellerSales in this case) that performs the required calculations. Since the measure values are mapped multiple times to the many side, we use simple assignments to zero out the measure for the groups to which it doesn’t apply.
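As a minimal sketch, such a zero-out assignment might look like this (the helper measure and group names are hypothetical):

```mdx
-- Suppress the helper measure for a group to which it doesn't apply
Scope ([Measures].[Reseller Sales Adjusted], [Group].[Groups].[Some Group]);
    this = 0;
End Scope;
```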

Yet Another Relative Dates Implementation

Yet another relative dates implementation in an Analysis Services cube. So far, I’ve implemented two. But I’m now starting a project to implement an accounting (G/L) cube, and the traditional relative date approach, where each relative date is calculated for the current member in the Date dimension, doesn’t fly anymore. Heck, the business analyst even found the excellent “Translating Cognos PowerPlay Transformer Relative Date Functionality into SQL Server Analysis Services (SSAS)” whitepaper by David Greenberg and told me that’s exactly what they need. That’s what happens when you have smart users. PowerPivot, anyone? While the business users are not empowered yet, let me share how I implemented this approach.

What’s Needed?

The difference now is that we want to drill down from each relative date to the periods that the relative date spans. Let’s take a look at the screenshot below.


The cube sets the default member of the Date hierarchy to the accounting closing period, which is May 2008 in this case. The Current Date relative date shows the actual measure values for May 2008 only. The Current QTD shows the actual measure values for April and May 2008. The Current QTR shows the actual measure values for April, May, and June, and so on. In other words, when the user slices by the Relative Dates and Date dimensions, the system shows the actual measure values for the periods that belong to that relative date. Consequently, the only way for the user to get the calculated relative date values, such as the Current YTD aggregate, is to force the Date dimension to the [All] member, but that’s fine with the business users.

Why We Disagree

My implementation differs from the above-mentioned whitepaper in the following ways:

  1. I use a regular Relative Dates dimension instead of the dimension calculated members that the SSAS BI Wizard and the whitepaper use. It looks to me like OLAP clients compete with each other over who can butcher dimension calculated members the most. Excel, for example, doesn’t retrieve calculated members by default, and it doesn’t let you select individual calculated members. Other tools ignore them altogether. To avoid such pitfalls, I defined the Relative Dates dimension as a regular dimension.
  2. The whitepaper defines extra columns in the DimDate table that flag the qualifying dates. My implementation doesn’t do this because it’s not needed.
  3. The whitepaper produces the relative dates results by aggregating the measure across the qualifying date members. My approach uses simple scope assignments as you will see in a moment.

How This Got Implemented?

The implementation of this Relative Dates approach is remarkably simple.

  1. Implement a SQL view, on which the Relative Dates dimension will be based, that hardcodes the Relative Dates members with SELECT…UNION statements. Note that the first row, on which the Date dimension will join, doesn’t have a name because we will hide it in the next step.

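A minimal sketch of such a view follows (the view and column names are illustrative; the nameless first row carries key 1, which the scope assignments below reference as [Relative Dates].&[1]):

```sql
CREATE VIEW vRelativeDates AS
SELECT 1 AS RelativeDateKey, NULL AS RelativeDateName  -- nameless row that joins the Date dimension
UNION ALL SELECT 2, 'Current Date'
UNION ALL SELECT 3, 'Current QTD'
UNION ALL SELECT 4, 'Current QTR';
```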

  2. Set up a Relative Dates dimension. Here I use a little trick to hide the first member because the user shouldn’t see it. Specifically, I set the AttributeHierarchyVisible property of the Relative Dates dimension key to False. Then, I created a Relative Dates hierarchy and set the HideMemberIf property of the Relative Dates level to NoName.


  3. I added a named calculation called RelativeDateKey to the DimDate table in the data source view with an expression value of 1 so I can relate the Relative Dates and Date dimensions.


  4. Next, we relate the Relative Dates dimension to all associated measure groups with a referenced relationship via the Date dimension.


  5. Time for some MDX programming. We use script assignments to populate the Relative Dates cells. By default, all cells intersecting with the Relative Dates members (except for the RelativeDateKey member) are empty. So, we just need to “carry over” the measure values from the qualifying members of the Date dimension. Here is what the first three assignments look like:

-- Current Date
Scope ([Relative Dates].[Current Date], [Date].[Date].DefaultMember);
    this = [Relative Dates].&[1];
End Scope;

-- Current QTD
Scope ([Relative Dates].[Current QTD],
       Exists(PeriodsToDate([Date].[Date Hierarchy].[Quarter], [Date].[Date Hierarchy].DefaultMember),
              [Date].[Date].[Date].Members));
    this = [Relative Dates].&[1];
End Scope;

-- Current QTR
Scope ([Relative Dates].[Current QTR],
       Exists(Descendants(Ancestor([Date].[Date Hierarchy].CurrentMember, [Date].[Date Hierarchy].[Quarter]),
                          [Date].[Date Hierarchy].[Date]),
              [Date].[Date].[Date].Members));
    this = [Relative Dates].&[1];
End Scope;

Prior to these assignments, the cube script sets the default member of the Date hierarchy to the accounting closing period. Each assignment scopes on the qualifying members of the Date dimension and sets the scope to [Relative Dates].&[1], which, as we’ve seen, is the first member of the Relative Dates dimension.

Why Does This Work?

The script carries over the measure values to the Relative Date members that intersect with the members of the Date attribute hierarchy. From there on, the measure aggregation function kicks in to roll up the values, e.g. by month, year, and quarter.

Reports as Data Feeds

SQL Server 2008 R2 Reporting Services features reports as Atom data feeds. This is accomplished with a new Atom data feed rendering extension, as shown on this screenshot.


This extension produces an Atom service document (an *.atomsvc file), which lists at least one data feed for each data region in a report. In fact, some regions, such as a tablix with column groups (a cross-tab report layout), may generate multiple feeds.
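Under the hood, an *.atomsvc file is an Atom Publishing Protocol service document. A simplified sketch might look like this (the URL and titles are illustrative):

```xml
<?xml version="1.0" encoding="utf-8"?>
<service xmlns="http://www.w3.org/2007/app"
         xmlns:atom="http://www.w3.org/2005/Atom">
  <workspace>
    <atom:title>Company Sales 2008</atom:title>
    <!-- one collection element per data feed exposed by the report -->
    <collection href="http://myserver/ReportServer?%2fCompany%20Sales%202008&amp;rs:Format=ATOM">
      <atom:title>Tablix1</atom:title>
    </collection>
  </workspace>
</service>
```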

At this point, you probably imagine subscribing to reports and receiving feeds in Outlook when the report data changes. Unfortunately, you won’t get very far doing so. As it turns out, these data feeds are different from the Atom feeds that most applications, such as IE and Outlook, understand directly. So even if you extract the URL from the service document and display the .atom rendering of a report in IE, the browser won’t show useful information because it doesn’t understand that the data rows in the data feed are the important part.

So, how are report data feeds useful then? At this point, the design goal for reports as data feeds is to support Gemini, the new Excel 2010 extension that lets end users build in-memory cubes. Gemini can acquire data from a variety of data sources, including data feeds and SSRS reports. Here is what it takes to subscribe to a report feed.


  1. In the Gemini add-in, click the From Data Feed button.
  2. On the Connect to Data Feed step of the Table Import Wizard, browse to the data feed file you exported from the report. In this case, I am pointing to the atomsvc file exported from the Company Sales 2008 report. On this step, you can also set up advanced options, such as timeout and authentication, and test the feed.


  3. On the Select Tables and Views step, select the regions on the report whose feeds you want to subscribe to. Since Company Sales 2008 has only one region, only one data feed is exposed. You can click the Preview button to preview the feed data. In the preview window, you can select which columns to import and set up a row filter (doesn’t seem to be working in the August CTP).


  4. On the Import Summary step, preview the feed options and click Finish to import the data.

This imports the feed data into the Excel spreadsheet. From there, you can use Gemini to build the in-memory model and analyze the report data.

How to Test SSRS MDX Queries in SQL Server Management Studio

I often need to capture an MDX query that a report sends to Analysis Services and execute it in SQL Server Management Studio (SSMS). I usually do this to understand what parameters the report passes to the query and to troubleshoot the query itself. Unfortunately, SSMS doesn’t support parameterized MDX queries so you have to resort to the following technique:

Capturing MDX Queries

The easiest way to capture an MDX query is to use the SQL Server Profiler.

  1. Start SQL Server Profiler from the Microsoft SQL Server <version>-> Performance Tools program group.
  2. Click File -> New Trace or press Ctrl+N.
  3. In the Connect To Server dialog box, choose Analysis Services, enter your server name, and click Connect.
  4. In the Trace Properties dialog box, click the Event Selection tab. Check the Show All Events and Show All Columns checkboxes.
  5. Optionally, if you want to see only events triggered by your actions, click the Column Filters button. In the Edit Filter dialog box, select NTUserName, expand the Like tree node on the right, and enter your Windows login surrounded by %, e.g. %t_lachev%.
  6. Back in the Trace Properties dialog box, click Run to start the trace.
  7. Now run the report whose queries you want to capture and watch the profiler output for a Query Begin event.

Typically, a report executes several MDX queries because there are also queries for the report parameters. You can use the profiler’s find function (Ctrl+F) to search for the query text you need in the TextData column in order to find the report’s main query (the one that supplies the report with data).

  1. Select the Query Begin event of the query you want. In the lower pane of the profiler, you will see the query’s MDX statement followed by the Parameters and PropertyList nodes, such as:

    WITH MEMBER MyKPI as StrToMember(@KPISelector)
    SELECT NON EMPTY {MyKPI} ON COLUMNS,
    NON EMPTY StrToSet(@Dataset)
        * StrToMember(@StartPeriod) : StrToMember(@EndPeriod) ON ROWS
    FROM Viking
    WHERE (StrToMember(@Organization))
    CELL PROPERTIES VALUE, BACK_COLOR, FORE_COLOR, FORMATTED_VALUE, FORMAT_STRING, FONT_NAME, FONT_SIZE, FONT_FLAGS

    <Parameters xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="urn:schemas-microsoft-com:xml-analysis">
      <Parameter>
        <Name>KPISelector</Name>
        <Value xsi:type="xsd:string">[Measures].[Return On Average Assets]</Value>
      </Parameter>
      <Parameter>
        <Name>StartPeriod</Name>
        <Value xsi:type="xsd:string">[Date].[Month].&amp;[20060101]</Value>
      </Parameter>
      <Parameter>
        <Name>EndPeriod</Name>
        <Value xsi:type="xsd:string">[Date].[Month].&amp;[20061201]</Value>
      </Parameter>
      <Parameter>
        <Name>Dataset</Name>
        <Value xsi:type="xsd:string">{ [Dataset].[Dataset].&amp;[-2],[Dataset].[Dataset].&amp;[-1] }</Value>
      </Parameter>
      <Parameter>
        <Name>Organization</Name>
        <Value xsi:type="xsd:string">[Organization].[Organization Hierarchy].&amp;[-1]</Value>
      </Parameter>
    </Parameters>

    <PropertyList xmlns="urn:schemas-microsoft-com:xml-analysis">
      <Catalog>VANTAGE_CUBE_M_BASELINE DEV</Catalog>
      <LocaleIdentifier>1033</LocaleIdentifier>
      <Format>Tabular</Format>
      <Content>SchemaData</Content>
      <Timeout>0</Timeout>
      <ReturnCellProperties>true</ReturnCellProperties>
      <DbpropMsmdFlattened2>true</DbpropMsmdFlattened2>
    </PropertyList>

  2. Click anywhere in the lower pane and press Ctrl+A to select all text, then Ctrl+C to copy it to the clipboard.

Testing MDX Queries

Now that you have the query, you are ready to test it in SSMS.

  1. Open SSMS and connect to the Analysis Services server.
  2. Right-click the SSAS database and click New Query -> MDX to create a new MDX query. Press Ctrl+V to paste the query text that you captured in the profiler.
  3. Since the profiler escapes certain characters, you first need to unescape them. For example, replace (Ctrl+H) all &amp; with &.
  4. One by one, go through the parameters listed in the Parameters node and replace each parameter placeholder in the MDX query with its actual value. For example, press Ctrl+H to search for @KPISelector (note that the parameters are prefixed with @ in the MDX query) and replace all occurrences (Replace All option) with the actual parameter value. Since SSRS MDX queries typically use the StrToMember or StrToSet functions, you need to surround the parameter values with quotes, e.g. “[Measures].[Return On Average Assets]”.
  5. Once you replace all the parameters, comment out the Parameters and PropertyList nodes by selecting all text in these nodes and clicking the Comment Out the Selected Lines toolbar button. The only text that is not commented out should be the MDX query text.

At this point, you can execute the query by pressing Ctrl+E or clicking the Exclamation toolbar button.
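To illustrate, here is what the main query from the capture above looks like after unescaping &amp; and inlining the parameter values:

```mdx
WITH MEMBER MyKPI as StrToMember("[Measures].[Return On Average Assets]")
SELECT NON EMPTY {MyKPI} ON COLUMNS,
NON EMPTY StrToSet("{ [Dataset].[Dataset].&[-2],[Dataset].[Dataset].&[-1] }")
    * StrToMember("[Date].[Month].&[20060101]") : StrToMember("[Date].[Month].&[20061201]") ON ROWS
FROM Viking
WHERE (StrToMember("[Organization].[Organization Hierarchy].&[-1]"))
CELL PROPERTIES VALUE, BACK_COLOR, FORE_COLOR, FORMATTED_VALUE, FORMAT_STRING, FONT_NAME, FONT_SIZE, FONT_FLAGS
```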

UPDATE

When things get tedious, the developer writes a program. As Greg Galloway pointed out, Darren Gosbell has already written a small but incredibly useful utility, SSAS Query Capture, which automates the process of capturing the queries and replacing the parameters. To move it forward to Visual Studio 2008 and SQL Server 2008, you need to:

  1. Upgrade the source code to Visual Studio 2008. To do so, just open the C# source project in Visual Studio 2008 and accept the defaults.
  2. Delete the Microsoft.AnalysisServices reference and add a reference to Microsoft.AnalysisServices.dll, which you can find in the \Program Files\Microsoft SQL Server\100\SDK\Assemblies\ folder.
  3. Build the project and run QueryCapture.exe from the bin\debug folder.

A cautionary note: the utility always encloses the parameter values in quotes because it assumes that they will be used inside the StrToMember and StrToSet functions, which the SSRS MDX Query Designer generates when you add parameters. If you reference parameters literally, as I demonstrated in this blog, remove the quotes surrounding these parameter values. Otherwise, chances are the query will return no data.

I can’t imagine how much time Darren’s Query Capture utility would have saved me if I had only known about it! I hope he adds it to the BIDS Helper project.

Suppressing Auto-generation of MDX Parameter Datasets

There is an unfortunate issue with the MDX Query Designer in the SSRS 2008 Report Designer and Report Builder 2.0 where changing the main dataset overwrites the parameter datasets. This is a rather annoying issue because you often need to make manual changes to the parameter datasets. However, when you make a change to the main dataset, the MDX Query Designer wipes out the manual changes and auto-generates the parameter datasets from scratch. According to the feedback I got from Microsoft, the issue will be fixed in SQL Server 2008 R2.

Meanwhile, there is a simple workaround which requires manually changing the dataset definition in RDL. Basically, you need to add a SuppressAutoUpdate designer switch to the parameter dataset, as follows:

<Query>
  <DataSourceName>Viking</DataSourceName>
  <CommandText> … </CommandText>
  <rd:SuppressAutoUpdate>true</rd:SuppressAutoUpdate>
  <rd:Hidden>false</rd:Hidden>
</Query>