BISM Tabular Dynamic Data Security Over Bridge Table

About four years ago, in my article Protect UDM with Dimension Data Security, I introduced an approach to secure dimension data in multidimensional cubes by using a factless bridge table. Since then, many BI practitioners have favored this approach because of its simplicity, reusability (transactional reports can join to the security table), and performance. Now that organizational tabular projects support data security, I wanted to try it with BISM Tabular. As it turned out, the factless bridge table approach worked out just fine.

Suppose that you’ve built a tabular project on top of the Adventure Works DW database and that SecurityResellerFilter is the bridge table that stores the resellers each employee is authorized to view. As shown in the diagram, SecurityResellerFilter has two relationships: one between SecurityResellerFilter[EmployeeKey] and Employee[EmployeeKey], and one between SecurityResellerFilter[ResellerKey] and Reseller[ResellerKey]. You’d probably want to hide the SecurityResellerFilter table so that end users can’t browse it.

[Figure: Tabular model diagram showing the SecurityResellerFilter bridge table related to Employee and Reseller]

Given this model, implementing dynamic data security over SecurityResellerFilter requires setting up a new role (click the Roles button on the Analysis Services toolbar) that uses a row filter.


A row filter defines a filter expression that determines which rows the role is allowed to see. To set up a row filter in the Role Manager, enter a DAX expression next to the table name. The DAX expression must be a Boolean expression that returns TRUE or FALSE. The Adventure Works role uses the following DAX expression for the row filter on the Reseller table:

=CONTAINS(RELATEDTABLE(SecurityResellerFilter), SecurityResellerFilter[EmployeeKey], LOOKUPVALUE(Employee[EmployeeKey], Employee[LoginID], USERNAME()))

The LOOKUPVALUE function obtains the employee key associated with the employee’s Windows login, which we get from the USERNAME function. Because the row filter is set on the Reseller table, for each reseller the CONTAINS function attempts to find a match for the combination of the reseller key and the employee key. Notice the use of the RELATEDTABLE function to pass the current reseller. The net effect is that CONTAINS returns TRUE if there is a row in the SecurityResellerFilter table that matches the ResellerKey and EmployeeKey combination.
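If the role doesn’t behave as expected, one way to troubleshoot the expression itself is to temporarily replace USERNAME() with a hard-coded login and check which resellers come back. The following variation is only a sketch; “mydomain\employee1” is a placeholder that you would replace with an actual Employee[LoginID] value from your model:

=CONTAINS(RELATEDTABLE(SecurityResellerFilter), SecurityResellerFilter[EmployeeKey], LOOKUPVALUE(Employee[EmployeeKey], Employee[LoginID], "mydomain\employee1"))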

You can use the Analyze in Excel feature (click the Analyze in Excel button on the Analysis Services toolbar) to test the security changes. To verify that the user has access only to the resellers he or she is authorized to see, add the ResellerName field from the Reseller table to the Column Labels zone.


 

The report should show on columns only the resellers that are associated with the interactive user in the SecurityResellerFilter table.

Rendering Reports in Chrome and Safari

A client complained that the Reporting Services ReportViewer doesn’t render reports in the Google Chrome or Apple Safari browsers. The magic property that apparently fixes this issue is SizeToReportContent="True":

<rsweb:ReportViewer ID="reportViewer" runat="server" Height="100%" Width="100%"
    SizeToReportContent="True">
</rsweb:ReportViewer>

Here is a link to Microsoft’s official stance on browser support for Reporting Services. A couple of notes:

  1. Currently, Microsoft doesn’t support report rendering in Chrome but revisits the browser support matrix periodically.
  2. Report Manager supports Internet Explorer only and doesn’t work in Safari or Firefox. However, report rendering by URL access or through the ReportViewer should work with the above change.

Specifying a Default Report Name on Export with ReportViewer

Normally, when you export a report, the Save As dialog box defaults to the report name. However, if you use the SSRS LoadReportDefinition API and view the report in the ReportViewer, you’ll find that when you export the report, the name defaults to a system-generated name, such as Report(N).pdf. That’s because LoadReportDefinition creates a transient report on the server that doesn’t have a name. However, the ReportViewer LocalReport and ServerReport objects support a DisplayName property that can be used to specify a custom report name, such as:

reportViewer.ServerReport.DisplayName = "My Report";

// the following lines are for reference only
reportViewer.ProcessingMode = Microsoft.Reporting.WebForms.ProcessingMode.Remote;
reportViewer.ServerReport.ReportServerUrl =
    new Uri(ConfigurationSettings.AppSettings[Shared.CONFIG_RS_ENDPOINT]);

SQL Server Denali CTP3 Demo VHD Available

Microsoft just released a Hyper-V image of SQL Server Denali CTP3 in action, including fully configured services and integration with SharePoint 2010 and Office 2010. The following software is configured on the virtual machine:

• SQL Server “Denali” CTP3
• SharePoint 2010
• Office 2010

CTP3 Issues with PowerPivot Management Dashboard

I ran into a couple of snags with the PowerPivot Management Dashboard and CTP3 of Denali. In my case, I installed PowerPivot for SharePoint on a standalone domain controller.

  1. The Actions web part of the PowerPivot Management Dashboard refuses to load, with the following exception logged in the SharePoint ULS log: “Unable to acquire information about the NT group ‘Enterprise Admins’”.
    Resolution: This is a known issue that will be fixed in RTM. The workaround is to (1) delete any accounts that display their SIDs (that is, accounts from a domain that is no longer valid), and (2) delete any universal or enterprise groups.
    Connect item: https://connect.microsoft.com/SQLServer/feedback/details/685945/system-exception-unable-to-acquire-information-about-the-nt-group-enterprise-admins
  2. The dashboard doesn’t display any data although usage data is collected. This happens because the PowerPivot system account doesn’t have rights to the PowerPivot SQL database, so the timer job fails. SQL Server Profiler shows the following error: “The SELECT permission was denied on the object ‘vHealth’, database ‘DefaultPowerPivotServiceApplicationDB’, schema ‘Usage’. An error occurred while processing the ‘Health’ table.”
    Resolution: Grant the PowerPivot system account rights to all tables and views in the PowerPivot database.
    Connect item: https://connect.microsoft.com/SQLServer/feedback/details/685955/no-data-in-powerpivot-management-dashboard-due-to-insufficient-rights

     

Reporting Services "Denali" Brings Export to Excel 2007

One welcome improvement in the forthcoming “Denali” release of Reporting Services is exporting to the Excel 2007 (*.xlsx) file format. I bet many companies will upgrade just to get this feature. In fact, xlsx is now the default rendering option when you export to Excel. The old Excel 2003 renderer is still there, but it’s hidden by default. And, yes, BI projects now integrate with Visual Studio 2010!

For more details about Excel 2007 rendering, please read the Exporting to Microsoft Excel (Report Builder 3.0 and SSRS) topic in BOL.


Reporting Services SharePoint Integration in SQL Server Denali

A week ago, I raved about how the new PowerPivot Configuration Tool would simplify configuring PowerPivot for SharePoint. Guess what? Setting up Reporting Services for SharePoint integrated mode got simplified too! In SQL Server Denali, Reporting Services leverages the SharePoint service application infrastructure and no longer requires installing a separate Reporting Services server. Not only does this simplify setup, but it also improves performance because there is no more round-tripping between SharePoint and the report server. Configuring Reporting Services for SharePoint integrated mode is a simple process that requires the following steps:

  1. Run the SQL Server Denali CTP3 setup and install Reporting Services and the Reporting Services add-in on the SharePoint application server.
  2. Once the bits are installed, open SharePoint Central Administration and click Manage Service Applications. Expand the New button and click SQL Server Reporting Services Service Application.
  3. In the page that follows, configure the service application details as appropriate for your environment and click OK.
  4. As with previous releases, one more step is required to enable the SSRS content types in a given document library:

    a) Go to the document library settings page and click Advanced Settings. In the Advanced Settings page, select Yes for the “Allow management of content types?” option.


    b) Back on the document library settings page, click the Add from Existing Site Content Types link.
    c) In the Add Content Types page, add the BISM Connection File (assuming PowerPivot for SharePoint is configured) and Report Builder content types.


Voila! Once this is done, you can create Crescent reports. You just need to create a BISM Connection File that points to a PowerPivot model or BISM Tabular model.


Transactional Reporting with BISM Tabular

Continuing on the subject of BISM Tabular performance (see my Cube vs. VertiPaq Query Performance blog), I was disappointed to find that CTP3 brings no performance improvements to PowerPivot/BISM Tabular in the area of detailed (transaction-level) reporting, which has plagued me since the R2 release. Consider the following report built on top of an Adventure Works PowerPivot model.


This report shows all Adventure Works customers and their order information by combining data from the Customer (18,000 rows) and InternetSales (60,000 rows) tables. Specifically, the report includes two columns from the Customer table and four columns from the InternetSales table. The report has a slicer to let the user filter by customer. Given that PowerPivot is an end-user tool, it’s likely that end users would build such reports.

Unfortunately, each time you click the slicer or filter a column, the report query takes about ten minutes (!) to execute. One would surely expect better performance from the “engine of the devil”. To be fair to VertiPaq, the Excel OLAP PivotTable is designed for summarized reports and not transactional reports. I am not excluding the possibility of a bug in the VertiPaq engine, because a similar report connected to the Adventure Works cube takes milliseconds to execute. Here is the MDX statement generated by the PowerPivot report when I filter on Aaron:

SELECT
  {[Measures].[Sum of OrderQuantity 2], [Measures].[Sum of SalesAmount 2]}
  DIMENSION PROPERTIES PARENT_UNIQUE_NAME, HIERARCHY_UNIQUE_NAME ON COLUMNS,
  NON EMPTY Hierarchize(
    DrilldownMember(
      DrilldownMember(
        DrilldownMember(
          CrossJoin(
            {[Customer].[FirstName].[All], [Customer].[FirstName].[FirstName].AllMembers},
            {([Customer].[LastName].[All], [InternetSales].[SalesOrderNumber].[All], [InternetSales].[SalesOrderLineNumber].[All])}),
          [Customer].[FirstName].[FirstName].AllMembers, [Customer].[LastName]),
        [Customer].[LastName].[LastName].AllMembers, [InternetSales].[SalesOrderNumber]),
      [InternetSales].[SalesOrderNumber].[SalesOrderNumber].AllMembers, [InternetSales].[SalesOrderLineNumber]))
  DIMENSION PROPERTIES PARENT_UNIQUE_NAME, HIERARCHY_UNIQUE_NAME ON ROWS
FROM (SELECT ({[Customer].[FirstName].&[Aaron]}) ON COLUMNS FROM [Sandbox])
CELL PROPERTIES VALUE, FORMAT_STRING, LANGUAGE, BACK_COLOR, FORE_COLOR, FONT_FLAGS

We will probably need to wait for the next version of Office to fix this performance issue at its core. Meanwhile, avoid Excel pivot reports for detail-level data and consider Crescent instead. Crescent generates native DAX queries, and I verified that its detail-level reporting is efficient.
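For comparison, here is a rough sketch of a native DAX query that returns a similar detail-level result. It is illustrative only (it simply reuses the Customer and InternetSales column names from the report above) and is not the actual query that Crescent generates:

DEFINE
    MEASURE InternetSales[Total Quantity] = SUM(InternetSales[OrderQuantity])
    MEASURE InternetSales[Total Amount] = SUM(InternetSales[SalesAmount])
EVALUATE
    FILTER(
        SUMMARIZE(
            InternetSales,
            Customer[FirstName],
            Customer[LastName],
            InternetSales[SalesOrderNumber],
            InternetSales[SalesOrderLineNumber],
            "Order Quantity", [Total Quantity],
            "Sales Amount", [Total Amount]
        ),
        Customer[FirstName] = "Aaron"
    )
ORDER BY Customer[LastName], InternetSales[SalesOrderNumber]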

Please vote for the Connect feedback item I’ve submitted on this subject.

UPDATE 8/14/2011

The slowdown is caused by the subselects added to the MDX query when you apply a slicer or column filter. Specifically, when the query requests calculations, the server applies Autoexists before doing NON EMPTY, which can be rather expensive over a large space. Filtering on a field added to the Report Filter zone is very fast because in that case the query uses a WHERE clause.

PowerPivot Configuration Tool

One great enhancement in SQL Server 11 (Denali) is the PowerPivot for SharePoint setup refactoring. Previously, the PowerPivot for SharePoint setup and configuration was performed at the end of the SQL Server setup. Consequently, when something failed it was very difficult to figure out what went wrong. In Denali, you perform the PowerPivot for SharePoint configuration as a post-installation step after the SQL Server setup completes. You can do this using the new PowerPivot Configuration Tool, SharePoint Central Administration, or PowerShell.

You can find the PowerPivot Configuration Tool in the Configuration Tools folder of the SQL Server Denali CTP3 program group. When you open the tool, it examines the SharePoint setup and presents a list of tasks (if any) that need to be done to make PowerPivot for SharePoint operational. Once you validate the tasks and run them, the tool performs the steps one at a time and indicates which step is currently taking place. You can select a task and click the Script tab to see a PowerShell script for that task, or the Output tab to see an execution log.

If a step fails, the tool stops and the remaining steps are not executed. You need to examine what went wrong, fix it, and rerun the tool to execute the remaining tasks. You can also use the tool to remove features, services, applications, and solutions.


SharePoint is a great product, but it’s also very complex and difficult to troubleshoot. The PowerPivot Configuration Tool will simplify the effort required to configure PowerPivot for SharePoint.

Cube vs. VertiPaq Query Performance

This is a big topic and frankly it’s too ambitious on my part to tackle it here. Assuming equivalent multidimensional (BISM Multidimensional) and tabular (BISM Tabular) models, I was curious how a multidimensional cube fares against VertiPaq in terms of query performance. To be fair to VertiPaq, I decided to use native DAX queries. As you’ve probably heard, BISM Tabular in SQL Server Denali will include a variant of DAX for querying tabular models deployed to SharePoint and to SSAS running in VertiPaq mode. Chris Webb has a good write-up about DAX queries here. The DAX EVALUATE construct allows external clients to query tabular models using native DAX syntax instead of MDX. Since BISM Tabular speaks DAX, DAX queries are likely to be more efficient and give you better performance when querying tabular models. At this point, only Crescent generates native DAX queries. The DAX query syntax is:

DEFINE
    MEASURE Table1[measure1] = <DAX_Expression>
    MEASURE Table2[measure2] = <DAX_Expression>
EVALUATE <DAX_Table_Expression>
ORDER BY
    <DAX_Expression> [ASC | DESC],
    <DAX_Expression> [ASC | DESC]
START AT
    Value_or_Parameter, Value_or_Parameter, …
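To put this syntax in context, here is a minimal query sketch. It assumes the FactResellerSales and DimDate tables used in the DAX query later in this post, plus a DimDate[CalendarYear] column, so treat it as illustrative rather than something to run verbatim against any model:

DEFINE
    MEASURE FactResellerSales[TotalSales] = SUM(FactResellerSales[SalesAmount])
EVALUATE
    ADDCOLUMNS(VALUES(DimDate[CalendarYear]), "Total Sales", [TotalSales])
ORDER BY DimDate[CalendarYear]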

To have a more sizable dataset, I used the Contoso cube for my tests: I created a BISM Tabular model and imported the Contoso data. Since you probably don’t have Contoso, I provide queries that target the Adventure Works cube. I started with the following unoptimized MDX query, which calculates the average sales amount by date across the products whose daily sales exceed their sales for the same date in the previous month:

WITH
MEMBER [Measures].SlowAvg AS
    Avg
    (
        Filter
        (
            [Product].[Product].[Product].MEMBERS,
            [Measures].[Sales Amount] > ([Measures].[Sales Amount], ParallelPeriod([Date].[Calendar].[Month]))
        ),
        [Measures].[Sales Amount]
    )
SELECT
    [Measures].SlowAvg ON 0,
    [Date].[Calendar].[Date].Members ON 1
FROM [Adventure Works];

Then, I optimized the query to take advantage of block computation mode, as follows:

WITH
MEMBER [Measures].diff AS
    IIF([Measures].[Sales Amount] > ([Measures].[Sales Amount], ParallelPeriod([Date].[Calendar].[Month])),
        [Measures].[Sales Amount], NULL)
MEMBER [Measures].SlowAvg AS
    Avg([Product].[Product].[Product].MEMBERS, [Measures].diff)
SELECT
    [Measures].SlowAvg ON 0,
    [Date].[Calendar].[Date].Members ON 1
FROM [Adventure Works];

Finally, the equivalent DAX query that I used to measure performance was:

DEFINE
    MEASURE FactResellerSales[TotalSales] = SUM([SalesAmount])
    MEASURE FactResellerSales[TotalSales - LastYear] = [TotalSales](SamePeriodLastYear(DimDate[FullDateAlternateKey]), ALL(DimDate))
    MEASURE FactResellerSales[AverageSales] = AVERAGEX(FILTER(VALUES(DimProduct[ProductKey]), [TotalSales] > [TotalSales - LastYear]), [TotalSales])
EVALUATE
    ADDCOLUMNS(FILTER(VALUES(DimDate[DateKey]), NOT ISBLANK([AverageSales])), "AverageSalesAmount", [AverageSales])
ORDER BY [DateKey]

And, the findings from the tests:

  1. MDX query un-optimized (cell-by-cell calculation mode), both on cold cache and executed a second time – 33 sec
  2. MDX query optimized (block computation mode) on cold cache – 4.8 sec
  3. MDX query optimized (block computation mode) executed a second time – 0.7 sec
  4. DAX query, both on cold cache and executed a second time – 6.4 sec

Here are some take-home notes:

  1. The fact that VertiPaq is an in-memory database doesn’t mean that it will perform much better than a multidimensional cube. The formula engine of BISM Multidimensional does cache query results in memory, and so does the Windows OS. In fact, the more the cube is used, the higher the chance that its data will end up in memory.
  2. VertiPaq might give you good performance without special tuning. All DAX calculations run in block computation mode.
  3. Optimized MDX queries might outperform VertiPaq, especially if the results are cached.
  4. DAX query results are never cached, which explains why DAX queries perform the same when executed repeatedly.

The fact that VertiPaq gives you a head start doesn’t mean that you cannot write inefficient DAX queries. For example, the following DAX measure definition returns the same results but it’s twice as slow.

MEASURE FactResellerSales[AverageSales] = AVERAGEX(FILTER(ADDCOLUMNS(VALUES(DimProduct[ProductKey]), "x", [TotalSales]), [x] > [TotalSales - LastYear]), [x])

Again, this is an isolated test case and your mileage might vary greatly depending on queries, data volumes, hardware, and so on. But I hope you can use it as a starting point for running your own tests while waiting for a VertiPaq performance guide.