Posts

PowerPivot Configuration Tool

One great enhancement in SQL Server 11 (Denali) is the refactored PowerPivot for SharePoint setup. Previously, the PowerPivot for SharePoint setup and configuration was performed at the end of the SQL Server setup, so when something failed it was very difficult to figure out what went wrong. In Denali, you perform the PowerPivot for SharePoint configuration as a post-installation step after the SQL Server setup completes. You can do this using the new PowerPivot Configuration Tool, SharePoint Central Administration, or PowerShell.

You can find the PowerPivot Configuration Tool in the SQL Server Denali CTP3 → Configuration Tools program group. When you open the tool, it examines the SharePoint setup and presents a list of tasks (if any) that need to be done to make PowerPivot for SharePoint operational. Once you validate the tasks and run them, the tool performs the steps one at a time and indicates which step is currently taking place. You can select a task and click the Script tab to see a PowerShell script for that task, or the Output tab to see an execution log.

If a step fails, the tool stops and the rest of the steps are not executed. You need to examine what went wrong, fix it, and rerun the tool to execute the remaining tasks. You can also use the tool to remove features, services, applications, and solutions.
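
If you prefer scripting, the same configuration can be driven entirely from PowerShell. Below is a minimal sketch; the snap-in and cmdlet names follow the CTP-era PowerPivot for SharePoint documentation and may change by RTM, and the server and database names are placeholders for your environment.

    # Load the PowerPivot configuration snap-in (name per the CTP documentation).
    Add-PSSnapin Microsoft.AnalysisServices.PowerPivot.Configuration.PowerShell

    # Provision the local Analysis Services engine and system service instances.
    New-PowerPivotEngineServiceInstance -Provision:$true
    New-PowerPivotSystemServiceInstance -Provision:$true

    # Create the PowerPivot service application and its supporting database
    # (placeholder names), and add it to the default service application proxy group.
    New-PowerPivotServiceApplication -ServiceApplicationName "PowerPivot Service Application" `
        -DatabaseServerName "MYSERVER" -DatabaseName "PowerPivotServiceApp_DB" -AddToDefaultProxyGroup:$true

A handy trick is to let the configuration tool generate the exact script for your environment from the Script tab and then tweak it from there.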


SharePoint is a great product, but it’s also very complex and difficult to troubleshoot. The PowerPivot Configuration Tool will simplify the management effort to configure PowerPivot for SharePoint.

PowerPivot and SQL Server Denali CTP3 Release Notes Available

Microsoft published the release notes for PowerPivot and SQL Server Denali Community Technology Preview (CTP) 3. This means that the CTP3 release is imminent, although the download link is not live yet. The release notes cover the new features pretty well.

UPDATE 7/12/2011

CTP3 got released today.

Business Intelligence Semantic Model – TechEd Redux

While I’ve been busy at TechEd talking to BI enthusiasts, professionals and wannabes and delivering sessions, the news is out about the new BI roadmap and BISM positioning. See:

  • T.K. Anand blog
  • T.K. Anand & Ashvini Sharma recorded session What’s New in Microsoft SQL Server Code-Named “Denali” for SQL Server Analysis Services and PowerPivot”?
  • Sean Boon, Carolyn Chao, and Sabrena McBride recorded session Abundantly “Crescent”: Demos Galore
  • Julie Strauss’ recorded session The Next Generation Design Tools for Analysis Services
  • And the feedback from the community insiders, including Chris Webb’s Good News on the Future of Analysis Services blog, Marco Russo’s Microsoft Updates its BI Roadmap blog, and Boyan Penev’s SSAS to BISM – Recent Developments blog.

So, I won’t repeat what’s been already said. Instead, I’ll attempt to summarize my thoughts about the new BI developments and give you some take-home points to help you plan future BI work.

  • I spent many hours with the SSAS and SSRS product groups at TechEd. I liked what I saw. I saw the SSRS and SSAS teams working together. I saw the impressive visualization capabilities of the sleek Crescent tool, which will give the competition a run for their money. And I saw the future of Analysis Services and the Microsoft BI platform – the new BISM model that is positioned to become what UDM promised to be.
  • BISM is now the unifying name for both the Multidimensional (OLAP) and Tabular or Relational (in-memory) models. Both models will support MDX and DAX query options. Consequently, Excel can see a tabular model as a cube and query it with MDX, while Crescent will be able to see an OLAP cube as a tabular model and send DAX queries to it. This is awesome news that warrants retracting the Ugly part from my blog Business Intelligence Semantic Model – The Good, The Bad, and the Ugly. Kudos to Microsoft for listening to the community and kudos to MVPs for providing relentless feedback!
  • Make no mistake though. Although the external interfaces are the same and there is a significant feature overlap, in Denali the two models will not be compatible with each other, and no migration path will be provided, e.g. OLAP to Tabular or vice versa. An analogy might be helpful here. Think of Microsoft Access and SQL Server relational databases. Both of them are databases, both support SQL, and both have similar features (tables, relationships, etc.). We could use the label “Relational Database Model” as a unifying name for both databases, although each path leads to a completely different model. So, why imply a unification here? Think of BISM as a prediction of the future. In time, Tabular may “borrow” OLAP features, such as MDX scripts, and OLAP may get Tabular’s elements, such as in-memory partitions. Thus, the divide between the models might blur to a point where the two paths converge into a single model.
  • The most important thing to remember is that the Tabular model is not a replacement for OLAP cubes. Anybody, and I repeat anybody, who claims the opposite should be brought to justice and tried in the BI Court. T.K. Anand from the Microsoft SSAS team actually has a faster procedure of strangling that person with his bare hands. You should see Tabular as a completing, not competing, technology to OLAP, a technology that opens interesting possibilities that were not previously possible with OLAP alone.
  • Regardless of naming and marketing, the question on everybody’s mind moving to Denali will be which path and model to choose for new projects. Here is my simplified personal plan based on the type of BI project at hand:

    Self-service BI – This is a no-brainer – PowerPivot and Excel. If you want to let business users analyze data on their own, look no further than Excel.

    Team BI – How do business users share PowerPivot models with other users? SharePoint and PowerPivot of course. Another no-brainer.

    Corporate BI – This is where things get trickier. Let’s go through a few scenarios:

  1. If you’ve started on the PowerPivot path but have exceeded the PowerPivot capabilities, e.g. you need more data (above 2GB) or row-level security, then a PowerPivot model migrated to Analysis Services running in VertiPaq mode will be a logical choice. This will give you a nice continuum of self-service BI to corporate BI on a single platform.
  2. If all you need is a simple model that wraps a layer over a raw database or warehouse for reporting & analytics and you don’t have experience with OLAP, consider a tabular model and DAX because of the lower learning curve and less rigid data schema requirements. The issue I have with the “simple” model is that, based on my experience, many projects start simple but grow exponentially in complexity over time. A many-to-many relationship may creep in, as well as time calculations, currency conversion, etc. So, be careful here; evaluate requirements (if you have them) and take your time to weigh features and prototype, because switching from one model to another will require a complete rewrite. Continuing the relational database analogy above, it will feel like moving from Access to SQL Server, and you won’t have a wizard. Denali Tabular won’t support OLAP features that we’ve grown to love, such as declarative parent-child and many-to-many relationships (it does support them via DAX calculations, but you’ll end up with a calculated measure for each numeric column, as in the SSAS 2000 days), scope assignments for implementing time calculations, allocations, and currency conversion, Excel what-if analysis, default members, named sets, and others. Although easy to start with, DAX calculations can become complex to a point where you might wonder what you are really saving by switching from MDX to DAX if avoiding the MDX learning curve is your goal.
  3. Enterprise-wide BI and Data Warehousing – Here, I’d stick with multidimensional OLAP cubes for now. I won’t bet complex, high-visibility projects on Tabular, at least not in Denali. First, more than likely such projects will require the features I mentioned above. Second, I’d give Tabular some time to marinate in the corporate space. Finally, in Denali only Crescent will support Tabular natively via DAX, whereas many clients are optimized to support MDX and OLAP cubes.

I’d like to indulge myself and think that one day, hopefully in a not-so-distant future, BISM will evolve into a true single model that delivers on the UDM promise and combines the best of OLAP and Tabular. Then we would be able to pick and mix features from both the OLAP and in-memory paths, e.g. in-memory storage with MDX calculations, and use the best technology for the task at hand. Until then, we will have to choose between the multidimensional and tabular paths.


TechEd 2011 Keynote

Watching the TechEd keynote, during which it was announced that some 10,000 people are attending TechEd this year. On the BI side of things, Amir Netz, Distinguished Engineer at Microsoft, came on stage to unveil some of the new features coming up in Denali and to demonstrate the BI continuum story in Denali (PowerPivot, SharePoint with Crescent reporting, and corporate BI). The corporate BI model had some 2 billion rows cached in the in-memory VertiPaq engine. You will be able to use Visual Studio to “take over” from Excel and build models exceeding the Excel limitation of a 2GB file size. Wait for more exciting news tomorrow at sessions DBI206 – What’s New in Microsoft SQL Server Code-Named “Denali” for SQL Server Analysis Services and PowerPivot and DBI211 – What’s New in Microsoft SQL Server Code-Named “Denali” for Reporting Services!


In the Windows Phone area, the interesting news was that an upgrade later this year will let mobile users integrate with Office 365 and Lync Mobile. Unfortunately, there is no news about making BI available on Windows Phone 7 yet.

Denali Forums

Microsoft launched the SQL Server 11 (Denali) pre-release forums and is eagerly awaiting your feedback. Judging by the number of BI-related forums, one can easily see that BI will be a big part of Denali, with major enhancements across the entire BI stack. Of course, there aren’t many questions to ask before we have the bits to play with (CTP1 doesn’t include the BI features), but you can still start probing Microsoft.

VertiPaq Column Store

In SQL 11, the VertiPaq column store that will power the new Business Intelligence Semantic Model (BISM) will be delivered in three ways:

1. PowerPivot – an in-process DLL with Excel 2010.

2. A second storage mode of Analysis Services that you will get by installing SSAS in VertiPaq mode.

3. A new columnstore index in the SQL RDBMS.

The third option piqued my interest. I wanted to know whether a custom application will be able to take advantage of these indexes outside VertiPaq. For example, this could be useful for standard reports that query the database directly. It turns out that this will be possible. The following paper discusses columnstore in more detail. In a nutshell, SQL 11 will run the VertiPaq engine in-process. Here are some highlights of the VertiPaq columnstore:

  • You create an index on a table.
  • You cannot update the table after the index is created. This is not a big limitation, since a data warehouse is typically loaded on a schedule, so the index can be dropped and rebuilt at the end of the ETL process.
  • The relational engine scans the entire table and loads it into memory in a VertiPaq store. As you can imagine, the more RAM, the better.
  • Query performance will be great but probably slower than VertiPaq in SSAS.
  • Creation of this index isn’t cheap, but the query performance gains should justify using it in common DW scenarios.
  • From a performance standpoint, it will be best to load data in VertiPaq (an SSAS store) instead of using the columnstore.
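
To make the drop-and-rebuild cycle above concrete, here is a minimal sketch driven from PowerShell via Invoke-Sqlcmd (which ships with the SQL Server PowerShell module). The table, index, column, and server names are hypothetical placeholders, and the CREATE COLUMNSTORE INDEX syntax follows the Denali CTP documentation.

    # Sketch of the drop/load/rebuild cycle around a nightly ETL run
    # (hypothetical object names throughout).
    Import-Module sqlps -DisableNameChecking

    $server = "MYSERVER"
    $db     = "MyWarehouse"

    # 1. Drop the columnstore index so the fact table becomes updatable again.
    Invoke-Sqlcmd -ServerInstance $server -Database $db -Query "
        IF EXISTS (SELECT 1 FROM sys.indexes WHERE name = 'csx_FactSales')
            DROP INDEX csx_FactSales ON dbo.FactSales;"

    # 2. Run the ETL load here (omitted from this sketch).

    # 3. Rebuild the index over the columns your reports scan most often.
    Invoke-Sqlcmd -ServerInstance $server -Database $db -Query "
        CREATE NONCLUSTERED COLUMNSTORE INDEX csx_FactSales
        ON dbo.FactSales (OrderDateKey, ProductKey, SalesAmount, OrderQuantity);"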

More Details about BISM (UDM 2.0)

It looks like this year’s SQL PASS was the one to attend. A must-read blog from Chris Webb, who apparently shares the same feelings and emotions about the seismic change in Analysis Services 11 as I do.

Crescent on the Horizon

Now that the official word got out during Ted Kummert’s keynote today at SQL PASS, I can open my mouth about Crescent – the code name of an ad-hoc reporting layer that will be released in the next version of SQL Server, Denali. Crescent is a major enhancement to Reporting Services and the Microsoft self-service BI strategy. Until now, SSRS hasn’t had a web-based report designer. Denali will change that by adding a brand new report authoring tool powered by Silverlight. So, this will be the fifth report designer after BIDS, Report Builder 1.0 (not sure if RB 1.0 will survive SQL 11), Report Builder 3.0, and the Visual Studio Report Designer.

Besides bringing report authoring to the web, what’s interesting about Crescent is that it will redefine the report authoring experience and even what a report is. Traditionally, Reporting Services reports (as well as reports from other vendors) have been “canned”, that is, once you publish the report, its layout becomes fixed. True, you can implement interactive features to jazz up the report a bit, but changes to the original design, such as adding new columns or switching from a tabular layout to a crosstab layout, require opening the report in a report designer, making the changes, and viewing/republishing the report. As you may recall, each of the previous report designers has separate design and preview modes.

Crescent will change all of this and make the reporting experience more interactive, similar to an Excel PivotTable and tools from other vendors, such as Tableau. Those of you who saw the keynote today got a sneak preview of Crescent and its capabilities. You saw how the end user can quickly create an interactive report by dragging metadata, a la Microsoft Excel, and then with a few mouse clicks change the report layout without switching to design mode. In fact, Crescent doesn’t have a formal design mode.

How will this magic happen? As it turns out, Crescent will be powered by a new ad-hoc model called Business Intelligence Semantic Model (BISM) that will probably be a fusion between SMDL (think Report Builder models) and PowerPivot, with the latter now also supporting relational data sources. Amir’s demo showed an impressive response time when querying a billion rows from a relational database. I still need to wrap my head around the new model as more details become available (stay tuned), but I am excited about it and the new BI scenarios it will make possible besides traditional standard reporting. It’s great to see the Reporting Services and Analysis Services teams working together, and I am sure good things will happen to those who wait. Following the trend toward SharePoint as a BI hub, Crescent unfortunately will be available only in SharePoint mode. At this point, we don’t know what Reporting Services and RDL features it will support, but one can expect tradeoffs given its first release, brand new architecture, and self-service BI focus.

So, Crescent is a code name for a new web-based fusion between SSRS and BISM (to be more accurate, Analysis Services in VertiPaq mode). I won’t be surprised if its official name is PowerReport. Now that I’ve piqued your interest, where is Crescent? Crescent is not included in CTP1. More than likely, it will be in the next CTP, which is expected around the January timeframe.