Analysis Services Processing Performance

Analysis Services has a very efficient processing architecture, and the server is capable of processing rows as fast as the data source can provide them. You should see a processing rate in the ballpark of 40,000-50,000 rows/sec or even better. One of my customers bought a shiny new HP ProLiant BL680c server only to find out that processing took three times longer than on the old server. I ran a simple test: execute the query that feeds processing on both the old and the new server. The query on the old server would return all rows within 2 minutes, while the same query on the new server would run for 20 minutes, which averages out to about 4,000 rows/sec. This test ruled out Analysis Services right off the bat; it was clear that the network was the bottleneck. Luckily, the server had a lot of processing power, so processing wasn't ten times slower.
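The back-of-the-envelope arithmetic behind these numbers can be sketched as follows (the total row count is inferred from the stated rate, so treat it as an approximation):

```python
# Throughput math for the two servers. The new server processed about
# 4,000 rows/sec over 20 minutes, which implies the total row count;
# the old server returned the same rows in 2 minutes.

slow_rate = 4_000                    # rows/sec observed on the new server
slow_secs = 20 * 60                  # 20 minutes
total_rows = slow_rate * slow_secs   # ~4.8 million rows

fast_secs = 2 * 60                   # 2 minutes on the old server
fast_rate = total_rows / fast_secs

print(f"total rows   ~ {total_rows:,}")          # ~4,800,000
print(f"old server   ~ {fast_rate:,.0f} rows/sec")  # ~40,000 rows/sec
```

The old server's implied rate of ~40,000 rows/sec lands right in the healthy ballpark mentioned above, which is what pointed the finger at the network rather than the server.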

As it turned out, the company had a policy to cap the network traffic at the switch for all non-production subnets for security and performance reasons. Since the new server was still considered a non-production server, it was plugged into a restricted network segment. The moral of this story is that basic troubleshooting steps can often help you isolate and resolve "huge" issues.

DynamicHeight Bug

The chart region in Reporting Services 2008 introduced the ability to dynamically size charts by setting the DynamicHeight and DynamicWidth properties, as Robert Bruckner explained in his blog. This feature is really useful, and I hope one day it makes its way to the other regions as well. A customer recently reported an issue where regions would overlap when the report is previewed in Print Layout mode or exported to a hard-page renderer, such as PDF. For example, in the report below the radar chart is positioned after the bar chart in RDL. However, in Print Layout preview the radar chart overlaps the bar chart. The customer tried every possible combination of enclosing one or more regions in rectangles, which helped avoid the overlapping to some degree but introduced other issues.


After some digging, I discovered that the issue is caused by the fact that the bar chart is configured for dynamic height and managed to confirm that this is a bug. I will post an update when I learn more. Meanwhile, one possible workaround is to re-arrange the report so the region with dynamic height (hopefully, it’s only one) appears last on the report.

UPDATE (9/18/2010)

Robert Bruckner provided the following workaround, which fixed the DynamicHeight issue for me:

1. Add a table with a single static cell (one row and one column).

2. Delete the table Details group.

3. Nest the chart in the table.

Importing SSAS KPIs in PerformancePoint

Forging ahead through the uncharted land of PerformancePoint 2010, I ran into a snag today. Attempting to import Analysis Services KPIs resulted in the following error:

An unexpected error occurred.  Error 47205.

Exception details:

System.IO.FileNotFoundException: Could not load file or assembly 'Microsoft.AnalysisServices, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified.

File name: 'Microsoft.AnalysisServices, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91'

   at Microsoft.PerformancePoint.Scorecards.Server.ImportExportHelper.GetImportableAsKpis(DataSource asDataSource)

   at Microsoft.PerformancePoint.Scorecards.Server.PmServer.GetImportableAsKpis(DataSource dataSource)

Since PerformancePoint was running on a SharePoint web front end (WFE) server which didn't have any of the SQL Server 2008 components installed, it was clear that PerformancePoint was missing a connectivity component. Among many other things, I tried installing Analysis Services Management Objects (AMO) from the SQL Server 2008 Feature Pack, but the error wouldn't go away. I fixed it by running the SQL Server 2008 setup program and installing the Client Tools Connectivity option only. Then the KPIs magically appeared in the Scorecard Wizard.
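If you want a quick sanity check before rerunning setup, you can verify whether a strongly named assembly like this one actually landed in the GAC. The sketch below just builds the expected path, assuming the standard GAC_MSIL folder layout on a default Windows install (the drive and folder names are assumptions):

```python
# Build the expected GAC path for a strongly named .NET assembly.
# The GAC_MSIL layout is <name>\<version>__<publicKeyToken>\<name>.dll.

def gac_path(name: str, version: str, token: str) -> str:
    return rf"C:\Windows\assembly\GAC_MSIL\{name}\{version}__{token}\{name}.dll"

path = gac_path("Microsoft.AnalysisServices", "10.0.0.0", "89845dcd8080cc91")
print(path)
# On the failing WFE, a file check at this path would come back negative
# until the connectivity components are installed.
```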


PerformancePoint 2010

PerformancePoint? Is it still around? It is (in SharePoint 2010), and it should pique your interest if you are serious about dashboarding. The planning component is of course gone, and I have to admit I never had too much faith in it. When it comes to dashboards, Microsoft gives you two implementation options:

  1. Reporting Services reports in SharePoint web parts – Pros include low cost, because Reporting Services is available with SharePoint Foundation, and no need to learn new skills. On the downside, you need to implement your own global filter web parts, assuming that you don't use SharePoint Server.
  2. PerformancePoint – This is a tool specifically designed for dashboards, and it just got better in SharePoint 2010. However, it requires SharePoint Server 2010, which you need for PowerPivot as well. Unfortunately, this puts you in the $5,000+ upfront investment bucket (Vidas has more to say about SharePoint pricing).

Personally, I was pleasantly surprised when I re-discovered PerformancePoint in SharePoint 2010. Here is a cool little dashboard I put together in a couple of hours after importing the Adventure Works KPIs.


I’ve been complaining for a while that Microsoft doesn’t have a web-based OLAP browser. PerformancePoint reporting capabilities (chart and grid) come pretty close. Below is a grid report bound to the Adventure Works reseller data. It would be really cool if PerformancePoint continues the trend to fill in the gap and adds more Excel-like features, such as filters, slicers, etc.


Yes, we now have the ProClarity remnants in the form of a Silverlight-based decomposition tree (requires Silverlight 3.0 on the client). To get it, I right-clicked a cell on the report and clicked Decomposition Tree. This lets me analyze sales by any dimension.


So, what's the catch besides the cost? The ridiculously difficult Kerberos configuration, of course, if you have a multi-server environment. In our case, just when we thought we had conquered the Kerberos beast with SSRS, we found that PerformancePoint didn't work. As it turned out, unlike Reporting Services, PerformancePoint requires constrained delegation and uses the Claims to Windows Token Service (c2WTS). So, follow the steps in the Configuring Kerberos Authentication for SSRS 2008 R2 with SharePoint 2010 whitepaper closely.

If you have SharePoint 2010 Server already, PerformancePoint definitely warrants your interest.

Kerberos Woes

[Wikipedia] Cerberus, (pronounced /ˈsɜrb(ə)rəs/);[1] Greek form: Κέρβερος, /ˈkerberos/[2] in Greek and Roman mythology, is a multi-headed hound (usually three-headed) which guards the gates of Hades, to prevent those who have crossed the river Styx from ever escaping.

I think Microsoft got the name right although the Windows version of Kerberos has more than three heads for sure. Yet another weekend spent in troubleshooting Kerberos. This one has an interesting setup.

  1. SharePoint Server 2010 on a Web Front End (WFE) box.
  2. SharePoint Server 2010 (core install) + SQL Server and Reporting Services 2008 integrated with SharePoint on a second box.
  3. Analysis Services 2008 R2 on a third box.

The customer wanted this setup to minimize the SQL Server licenses. Microsoft recommends installing Reporting Services on the WFE servers, but this requires as many SQL Server licenses as the number of WFE servers, plus probably two more: one for the SQL Server instance hosting the SharePoint configuration databases and one for Analysis Services if it is on a separate box. By contrast, the above setup requires two SQL Server licenses irrespective of the number of WFE servers.
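The licensing arithmetic can be sketched as follows, assuming one SQL Server license per server that runs any SQL Server component (check your actual licensing terms, of course):

```python
# License count under the recommended topology: SSRS on every WFE,
# plus one license for the SQL Server hosting the SharePoint
# configuration databases and one for Analysis Services.
def licenses_recommended(wfe_count: int) -> int:
    return wfe_count + 2

# License count under the consolidated topology: SSRS shares a box
# with SQL Server, Analysis Services gets its own box, and the WFE
# servers run no SQL Server components at all.
def licenses_consolidated(wfe_count: int) -> int:
    return 2

for wfe in (1, 2, 4):
    print(wfe, licenses_recommended(wfe), licenses_consolidated(wfe))
```

With four WFE servers the recommended topology needs six licenses versus two, which is exactly the saving that motivated the customer's setup.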

We hit issue #1 when we tried to deploy from BIDS and got a login prompt that just wouldn't go away. This was related to the fact that the SQL Server setup configures Reporting Services for NTLM authentication. Since this scenario required Kerberos delegation, we had to change the following setting in the rsreportserver.config file:

<Authentication>
    <AuthenticationTypes>
        <RSWindowsNegotiate/>
        <!--<RSWindowsNTLM/>-->
    </AuthenticationTypes>
    <EnableAuthPersistence>true</EnableAuthPersistence>
</Authentication>

As you can see, we had to disable NTLM and add Negotiate. This got us to a point where BIDS could deploy reports to the SharePoint site and we could manage data sources and report properties. However, for some reason, after a few minutes the data source and report definitions would somehow get invalidated, and trying to access their properties resulted in an HTTP 401 Unauthorized error. Re-deploying the reports and data sources "fixed" the definitions for a few minutes, and then the 401 error returned.

More troubleshooting… and we figured out the issue was related to a misconfigured SPN entry for the SSRS service account on the SharePoint core (database) server. We found it by executing setspn -l <SSRS service account>; the results didn't include http/<the account used for SQL Server>. Once we got the Active Directory administrator to fix this, the issue went away.
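The check we performed amounts to a membership test over the SPN list that setspn -l returns. Here is a sketch of that test; the account and host names below are made up for illustration:

```python
# Given the SPNs registered for the SSRS service account (as listed by
# "setspn -l DOMAIN\ssrs_svc"), verify that every expected http SPN is
# present. All names here are hypothetical examples.

def missing_spns(registered: list[str], required: list[str]) -> list[str]:
    have = {spn.lower() for spn in registered}   # SPNs are case-insensitive
    return [spn for spn in required if spn.lower() not in have]

registered = [
    "http/reportserver",                 # short host name was registered
    # the FQDN entry was missing in a case like ours
]
required = ["http/reportserver", "http/reportserver.corp.local"]

print(missing_spns(registered, required))
```

Anything the function returns is an SPN your Active Directory administrator needs to add before Kerberos delegation will work end to end.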

We found the Configuring Kerberos Authentication for SSRS 2008 R2 with SharePoint 2010 whitepaper very useful. It's a must-read before venturing into the Kerberos candy land.

Good luck!

Applied PowerPivot Course Available

I am excited to announce that Prologika has added an Applied PowerPivot course to our list of training offerings in response to the strong interest in self-service BI. The class can be delivered as a two-day online class (4 hours each day) or as a one-day onsite class. The cost for the online version is $599. Applied PowerPivot is designed to help students become proficient with PowerPivot and acquire the necessary skills to implement PowerPivot applications, perform data analysis, and share these applications with other users. The full course syllabus is available here. I scheduled the first run for September 21st. Happy self-service BI!

PowerPivot Implicit Data Conversion

A student noticed an interesting behavior while I was teaching my PowerPivot class last week: PowerPivot lets you join a text-based column to a numeric column. As it turns out, PowerPivot does an implicit conversion to text when it discovers that you are attempting to join columns of different data types. However, as a best practice, you should convert text-based columns that store numbers to a numeric data type, such as Whole Number. This will help PowerPivot minimize storage and improve performance.
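The behavior is easy to mimic outside PowerPivot. The sketch below emulates what an implicit conversion to text does to a join, and hints at why text keys are the more expensive choice (PowerPivot's column store is far more sophisticated, so take the object sizes only as a directional illustration):

```python
import sys

# Emulate joining a text key column to a numeric key column by
# implicitly converting the numeric side to text -- roughly what the
# engine does behind the scenes when the data types don't match.
text_keys = ["101", "102", "103"]     # text-based column
numeric_keys = [101, 102, 104]        # numeric column

matches = [k for k in numeric_keys if str(k) in set(text_keys)]
print(matches)  # [101, 102] -- the join works despite the type mismatch

# Text keys generally cost more to store and compare than integers.
print(sys.getsizeof("123456789"), sys.getsizeof(123456789))
```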

Professional Microsoft PowerPivot for Excel and SharePoint

I had the pleasure of reading the book Professional Microsoft PowerPivot for Excel and SharePoint by Sivakumar Harinath, Ron Pihlgren, and Denny Guang-Yeu Lee. All of the authors are with the Microsoft Analysis Services team. Together with David Wickert, Denny Lee runs the http://powerpivottwins.com/ blog, which is "dedicated to all things PowerPivot". Siva has written the 2005 and 2008 editions of Professional SQL Server Analysis Services with MDX. So, the book is straight from the horse's mouth.

I really liked the book. It's easy to follow and includes insightful tips. The book targets business users who are interested in creating self-service BI solutions. IT professionals and BI practitioners who will manage PowerPivot applications will find the book useful as well. Professional Microsoft PowerPivot for Excel and SharePoint includes tutorials that walk the user through the necessary steps to build a PowerPivot-based solution. These steps include importing the data, enriching the data model, and creating calculations and reports. The code samples use a sample database for the healthcare industry – I guess that's because healthcare is the only industry with a budget for IT applications nowadays, or at least self-service BI apps. Adventure Works is probably waiting for the economy to bounce back.

The authors are quick (page 17) to explain the role of corporate BI now that the self-service BI era has dawned on us – something I was eager to get the official Microsoft opinion about. Their short answer is "not much will change", and that's precisely what I think. IMO, self-service BI is a compromise between corporate BI (best) and reality. Ideally, I don't want business users to do anything but analyze. Why should business users import data, relate it, and create calculations that, as complexity increases, look more bizarre than the equivalent MDX expressions? But we don't live in a perfect world, and PowerPivot comes to fill the void. Here are some scenarios where PowerPivot might be a good fit:

1. You don’t have a data mart and Excel spreadsheets rule the reporting world. With some help from IT, business users might be able to get the data in the right format and report from it right in Excel.

2. You do have a data mart but you don’t want/don’t have the skills to build a cube. But if you’ve gone that far, why not travel the remaining 20% and have a cool cube that business users trust and love?

3. You have a data mart/cube but not all data is in the data mart/cube or you need to mash up data from various data sources.

4. You suffer from the NIHS (Not Invented Here Syndrome) and you just want to do things your way despite what IT and BI guys like your humble correspondent tell you.

All of these scenarios require compromises in terms of data quality, security, features, and implementation effort. The problems that we aim to solve with corporate BI are not trivial, and there are no shortcuts. I know it and you know it. So, be skeptical when you see an impressive presentation that shows how self-service BI will take over corporate BI, no matter whether the tool is PowerPivot, QlikView, Tableau, or something else.

Overall, I’d highly recommend this book to anyone interested in PowerPivot and self-service BI. The book does a great job explaining the capabilities and limitations of the tool. I couldn’t stop reading it!

PivotViewer Extension for Reporting Services – a Cool BI Fusion with Confusing Name

Although late to the party, I announce the arrival of a very interesting BI technology – PivotViewer Extension for Reporting Services – CTP1. What’s interesting is that it leverages several BI technologies to provide great data visualization:

  1. The Microsoft LiveLabs Pivot which was developed by Microsoft Research.
  2. PowerPivot – a Microsoft self-service BI tool.
  3. Analysis Services R2
  4. Reporting Services R2
  5. SharePoint 2010

You should keep a close eye on this combination (or at least the last four) as it will become increasingly important in the Microsoft BI stack. Robert Bruckner has provided details about the new arrival. Christian Petculescu, a Principal Architect on the SSAS team, is the driving force behind it. Other MVPs have spread the news. But Kasper de Jonge made my day! He has written a GREAT blog post about it with step-by-step instructions to create your own PivotViewer solution. His post also cleared my confusion about the role of Reporting Services in this architecture. Reporting Services has not been extended in any way; it's just used as an image generator to produce the Pivot images. A better name for the tool could have been PivotViewer Add-in for SharePoint.

Interestingly, Microsoft Pivot does its work by counting items in collections. It doesn't support any other aggregation functions. Therefore, PivotViewer is not a replacement for the Excel PivotTable; it gives you an additional way to visualize your data.
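To make the limitation concrete, here is a sketch of that aggregation model: items in a collection carry facet values, and the only measure available is a count of items per facet value (the facet names below are made up):

```python
from collections import Counter

# Each item in a Pivot-style collection carries facet values. The
# viewer can only count items per facet value -- no sums, averages,
# or other aggregations of numeric attributes.
items = [
    {"Category": "Bikes", "Color": "Red"},
    {"Category": "Bikes", "Color": "Black"},
    {"Category": "Accessories", "Color": "Red"},
]

by_category = Counter(item["Category"] for item in items)
by_color = Counter(item["Color"] for item in items)

print(by_category)  # Counter({'Bikes': 2, 'Accessories': 1})
print(by_color)     # Counter({'Red': 2, 'Black': 1})
```

Contrast this with a PivotTable, where the same items could also be summed or averaged over a sales amount; that is the gap PivotViewer does not fill.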

UPDATE

Amir Netz, the mastermind behind PowerPivot, demonstrated PivotViewer during the second-day keynote of the BI conference this year. Forward to 1:21. He also talks about the new features that the next version of PowerPivot (SQL Server 11) will bring, including KPI support, BIDS development support, etc.

Atlanta BI Group First Meeting Topic Announced

I've just updated the Atlanta BI Group home page to announce the topic for our first meeting on August 23rd. Given the great interest surrounding self-service BI, I'll present "Self-Service BI with Microsoft PowerPivot".

Hope you can make our first meeting!