Presenting at Atlanta.MDF

I’m presenting at the Atlanta.MDF group on Monday, January 12th. I’ll be covering a wide range of tips and techniques for analyzing and improving the performance of SQL Server-based data analytics solutions. Hope you can make it.

Title: Can Your Data Analytics Solution Scale?

Abstract: Does your ETL exceed its processing window? Do your users complain about the SSRS spinny? Can your SQL Server database design deliver the expected performance? Can the system scale to thousands of users? Join this session to learn best practices and tips for isolating bottlenecks and improving the performance of data analytics solutions. I’ll dissect the layers of a “classic” solution (relational database, ETL, data model, reports) and share solutions harvested from real-life projects to address common performance-related issues. 

SQL Server and 20 Cores Limit

Scenario: You execute a task that uses parallelism, such as an index rebuild or a large query, on a server with more than 20 cores running SQL Server 2012 Enterprise Edition. In the Windows Task Manager, you observe that the task uses only 20 cores. We discovered this scenario during a rebuild of a columnstore index. To confirm the limit, you examine the SQL Server log and notice that a message similar to the following is logged when the SQL Server instance starts:

“SQL Server detected 8 sockets with 4 cores per socket and 4 logical processors per socket, 32 total logical processors; using 20 logical processors based on SQL Server licensing. This is an informational message; no user action is required.”

Explanation: More than likely, you have upgraded to SQL Server 2012 from SQL Server 2008 R2 under Software Assurance. Microsoft created a special SKU of Enterprise Edition to support this scenario with the caveat that this SKU limits an instance to using only 20 processor cores (or 40 CPU threads if hyperthreading is enabled). If this level of parallelism is not enough, the only solution is to switch to the Enterprise Edition SKU that is licensed per core and purchase a license that covers as many cores as needed. Once you obtain the new license key, you can upgrade your SQL Server instance:

Setup.exe /q /ACTION=editionupgrade /INSTANCENAME=MSSQLSERVER /PID=<PID key for new edition> /IACCEPTSQLSERVERLICENSETERMS
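
Before (and after) the upgrade, you can confirm the edition and how many schedulers the instance actually brought online by querying the DMVs. A quick check along these lines (on a CAL-limited instance, the cores above 20 show up with a VISIBLE OFFLINE status in sys.dm_os_schedulers):

-- Edition, total logical processors, and schedulers actually online
SELECT SERVERPROPERTY('Edition') AS Edition,
       cpu_count AS LogicalProcessors,
       (SELECT COUNT(*) FROM sys.dm_os_schedulers
        WHERE [status] = 'VISIBLE ONLINE') AS SchedulersOnline
FROM sys.dm_os_sys_info;

The per-core SKU reports itself as “Enterprise Edition: Core-based Licensing”, while the Server+CAL SKU reports just “Enterprise Edition”.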

Tabular M2M Relationships on the Horizon

One of the biggest strengths of Microsoft self-service BI is the ability to create sophisticated data models on a par with organizational BI models built by professionals. This fact is often overlooked when organizations evaluate self-service tools, and the decision is often made on other factors rather than an insightful understanding of the data model capabilities. This is unfortunate because most popular tools on the market don’t go much further than supporting a single dataset. By contrast, Power Pivot allows you to easily import multiple datasets from virtually anywhere and join the resulting tables as you can in Microsoft Access. This brings tremendous flexibility and analytical power.

Unlike multidimensional cubes, the Power Pivot and Tabular data models have lacked support for declarative many-to-many (M2M) relationships. The workaround has been a simple DAX formula that resolves the relationship over a bridge table, such as =CALCULATE(SUM(Table[Column]), <BridgeTable>), but this approach might present maintenance issues, as you have to create multiple calculated measures to support different slicing and dicing needs. However, as pointed out in my latest newsletter, the upcoming version of Power BI aims to remove adoption barriers and adds new features. One of these features is bidirectional relationships with declarative support for M2M relationships, which Chris Webb already wrote about.

To test the M2M relationship, I attempted to create the same M2M scenario that I used in my book, which models a joint bank account. The corresponding Power Pivot schema is shown below. The CustomerAccount table is the bridge table that resolves the M2M relationship (a customer might have many accounts and a bank account might be shared by multiple customers). The Balances table stores the account balances over time, and the Date table lets us analyze these balances by date.

[Figure: the Power Pivot schema with the CustomerAccount bridge table]
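
Without the new declarative support, resolving this schema in DAX requires an explicit measure that filters over the bridge table. A minimal sketch, assuming the balance amount lives in a Balances[Balance] column (hypothetical name):

-- Resolve the M2M relationship by filtering through the bridge table
Total Balance := CALCULATE(SUM(Balances[Balance]), CustomerAccount)

Passing CustomerAccount as a filter argument propagates the customer selection through the bridge table to Balances.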

Setting up an M2M relationship in the Power BI Designer is achieved by changing the “Cross filter direction” relationship setting to Both. This setting and bi-directional relationships are described in more detail here.

[Figure: the “Cross filter direction” setting in the Power BI Designer]

Indeed, creating a report that shows balances by customer resolves the M2M relationship and aggregates correctly.

[Figure: report showing balances aggregated by customer]

Unfortunately, attempting to slice the report by Date returns an error in the preview version of the Power BI Designer, so the M2M feature is still a work in progress. Taking this further, a useful addition could be declarative semi-additive functions that allow the user to set the aggregation behavior of the Balance measure, such as to LastNonEmpty. Similar to Multidimensional, this would avoid the need for user-defined explicit measures.
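
Until then, semi-additive behavior requires an explicit DAX measure. A minimal LastNonEmpty-style sketch, again assuming a Balances[Balance] column and a ‘Date’[Date] column:

-- Report the balance as of the last date with data in the current filter context
Last Balance := CALCULATE(SUM(Balances[Balance]),
    LASTNONBLANK('Date'[Date], CALCULATE(SUM(Balances[Balance]))))

This way quarter and year totals report the closing balance instead of a meaningless sum of the daily snapshots.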

Getting ETL Task Duration

Happy New Year!

Does your ETL exceed its processing time window? Optimizing ETL starts with obtaining task-level execution times. If you use the SSIS 2012 project deployment mode, task-level stats are already loaded in the SSIS catalog and you can use the following query:

SELECT execution_id,
       CASE WHEN [status] = 1 THEN 'created'
            WHEN [status] = 2 THEN 'running'
            WHEN [status] = 3 THEN 'canceled'
            WHEN [status] = 4 THEN 'failed'
            WHEN [status] = 5 THEN 'pending'
            WHEN [status] = 6 THEN 'ended unexpectedly'
            WHEN [status] = 7 THEN 'succeeded'
            WHEN [status] = 8 THEN 'stopping'
            WHEN [status] = 9 THEN 'completed'
       END AS [status_text],
       DATEDIFF(ss, start_time, end_time) AS DurationInSeconds
FROM catalog.executions e
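
Note that catalog.executions reports durations at the package execution level. For a task-level breakdown in the catalog, you can query the catalog.executable_statistics view along these lines (execution_duration is in milliseconds):

-- Per-task durations for each execution, slowest first
SELECT es.execution_id,
       es.execution_path,
       es.execution_duration / 1000 AS DurationInSeconds
FROM catalog.executable_statistics es
ORDER BY es.execution_duration DESC;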

What if you are not yet on SSIS 2012 or later, or you are not using the project deployment mode or a framework that logs task durations? You can still obtain the task durations, but you need to enable SSIS logging for each package you want to monitor, as follows:

  1. Open the package in BIDS/SSDT.
  2. On the SSIS menu, click Logging. Configure logging to use the SSIS Log Provider for SQL Server. The provider will save the statistics in a SQL Server table so you can easily query the results.
  3. On the Details tab, select the OnPreExecute, OnPostExecute, and most importantly the OnProgress events so you can get the same level of execution statistics as in the BIDS/SSDT Progress tab.
  4. Once you configure the SSIS Log Provider for SQL Server, it will create a sysssislog table in the database you specified when you configured the provider. When the SQL Server Agent executes your package, you can use a query like the one below to obtain task-level durations:

    SELECT executionid Execution,
           PackageName = (SELECT TOP 1 source FROM dbo.sysssislog S
                          WHERE S.executionid = L.executionid AND S.[event] = 'PackageStart'),
           PackageStep = L.source,
           Runtime_min = DATEDIFF(minute, MIN(L.starttime), MAX(L.endtime)),
           Runtime_hr = DATEDIFF(hour, MIN(L.starttime), MAX(L.endtime)),
           StartTime = MIN(L.starttime),
           EndTime = MAX(L.endtime),
           FinishDate = CAST(MAX(L.endtime) AS DATE)
    FROM dbo.sysssislog L
    WHERE L.[event] != 'PackageStart'
    GROUP BY executionid, L.source
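
To zero in on the bottlenecks, you can append ORDER BY Runtime_min DESC after the GROUP BY clause so that the longest-running steps sort to the top.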