Business Intelligence Semantic Model – The Good, The Bad, and the Ugly

UPDATE 11/14/2010

This post will probably be the most updated one I've ever written. I have to admit that my initial reaction to BISM was negative. For the most part, this was a result of my disappointment that Microsoft switched focus from UDM to BISM in Denali, and of my limited knowledge of the BISM vision. After SQL PASS, I exchanged plenty of e-mails with Microsoft, and they were patient enough to address my questions and disclose more details. Their answers helped me to "get it" and see the BISM big picture through a more optimistic lens.

 

UPDATE 05/21/2011

Having heard the feedback from the community, Microsoft announced at TechEd 2011 that Crescent will support OLAP cubes as data sources. This warrants removing the Ugly part from this blog.

 

After the WOW announcement at SQL PASS about PowerPivot going corporate under a new name, Business Intelligence Semantic Model (BISM), there were a lot of questions from the SSAS community. Your humble correspondent did his own share of nagging. For some obscure reason, this time Microsoft decided to keep MVPs and community leaders in the dark until PASS, so I was as unprepared as the rest of the community for what was to come. Prior to PASS, the SSAS team told me that corporate BI would be a major focus in Denali, which I interpreted as enhancements to UDM, only to find out that the entire buzz was about BISM… a thunderbolt from a clear sky. To its credit, the SSAS team was quick in its attempt to contain the collateral damage, as we can see in the posts below from Amir Netz and T.K. Anand.

http://cwebbbi.wordpress.com/2010/11/11/pass-summit-day-2/#comment-1498

http://blogs.technet.com/b/dataplatforminsider/archive/2010/11/12/analysis-services-roadmap-for-sql-server-denali-and-beyond.aspx

I decided to put down the gist of what I've learned about BISM so I have a ready answer for the questions that I am sure I will be asked over and over. Let's start with the WHY.

Why BISM?

While SSAS is now the undisputed leader in the OLAP space and the BI platform of choice, Microsoft is facing competitive pressures. As you know, new vendors and technologies are popping up like daisies in the fast-changing BI landscape, attacking on all fronts – QlikView, Tableau, the BusinessObjects Universe, to name a few – and Microsoft cannot rest on its laurels. The main selling point of these companies is that OLAP is too complex. I've seen their marketing campaigns in action and I know how persuasive they can be. One interesting thing I've noticed, though, is that the sleek demos I've seen all used star schemas, where data is neatly organized and stored in a set of dimension and fact tables. Creating this schema and populating it usually takes about 60-80% of the development effort for implementing a BI solution.

So if the star schema is not obsolete, what's so appealing about these vendors' offerings? The notion that the OLAP layer is too complex and you don't need it. This is a clever strategy (and one of the laws of power) – if you can't have it (OLAP market share), then despise it :) Instead, these companies build their analytical layer as a thin wrapper directly on top of the star schema, which is exactly what PowerPivot does. This shields IT from having to learn all those intimidating OLAP concepts, such as cubes, dimensions, measures, and the most complex of them all – MDX. This is a perceived simplification, though, because it's unlikely to meet more demanding requirements. For example, I wouldn't have been successful implementing financial BI solutions if I didn't have UDM features such as scope assignments, parent-child dimensions, and many-to-many relationships.
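To make "scope assignments" concrete, here is a minimal sketch of what such an MDX cube script looks like. All metadata names ([Sales Forecast], [Sales Amount], the [Date].[Calendar] hierarchy) are hypothetical – this illustrates the technique, not code from any particular solution:

  // Seed a forecast measure from last year's actuals plus 10%.
  // Hypothetical cube metadata; a minimal scope assignment sketch.
  SCOPE ([Measures].[Sales Forecast]);
      THIS = ([Measures].[Sales Amount],
              ParallelPeriod([Date].[Calendar].[Calendar Year], 1,
                             [Date].[Calendar].CurrentMember)) * 1.1;
  END SCOPE;

One declarative statement recalculates the measure across the entire cube space – the kind of power that is hard to replicate in a thin wrapper over a star schema.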

While I don't personally believe that UDM (the SSAS implementation of the OLAP layer) is too complex, I do think that it carries a lot of baggage, partly due to its deep roots in traditional OLAP and its evolution over the years. Let's name a few items:

  1. Cubes – The cube is a core OLAP concept that became somewhat outdated after the move to UDM, which is an attribute-based model. Ever wondered whether you should go for one cube or multiple cubes? This is one of the main dilemmas when starting a new project.
  2. Detail-level reporting – This is where UDM over-promised but under-delivered. For example, UDM doesn't support text-based measures. The clumsy workarounds are actions or degenerate dimensions, with the latter being what "classic" OLAP recommends.
  3. Storage complexities – the need to process the cube, partitioning, aggregations, etc. Moving to BISM, you still need to load the data into VertiPaq (unless you go real-time), so there is still "processing" involved, but there is no need for aggregations or "index building" anymore. Partitions are still there, but they are needed for data management purposes (incremental updates, rolling off data, etc.) rather than for performance reasons.
  4. Data latency – UDM proactive caching attempts to address it, but it's far from ideal.

Again, I don't list complexity because I don't think UDM is very complex. Or if it is, that's because business requirements are, and I sure don't want BISM to have fewer features that will prevent us from meeting such requirements. So, the SSAS team had a dilemma: should they continue enhancing UDM or go back to the drawing board? Microsoft realized that the UDM internal architecture and storage model have become increasingly complex over the years, barring major changes and growth. Instead of continuing to build on the old foundation, they decided to start from scratch with the VertiPaq column-oriented store that debuted in PowerPivot. I have to give credit to the SSAS team for making such a bold and gut-wrenching decision, which I am sure wasn't made lightly.

The Good

BISM is essentially a fresh start for Microsoft in the BI space. It will attempt to simplify OLAP and make BI even more accessible. As many BI professionals agree, based on what we know and have seen in PowerPivot, BISM has huge potential and will bring welcome enhancements, such as:

  1. Schema simplification – no need to define explicit cubes, dimensions, and measures, which eliminates the perceived complexity of implementing an OLAP solution. In fact, expect the term "OLAP" to be heavily deemphasized in the years to come.
  2. Blazing performance – this will probably obviate the need for partitions and aggregations.
  3. Flexibility – there will be no distinction between measures and dimensions. Every attribute can be used for aggregating and slicing. This is the PowerPivot feature I like most.
  4. Detail-level reporting without hacks.
  5. DAX – an expression-based language that removes some of the complexity of MDX (see the sketch after this list).
  6. Possibility of real-time data access – BISM running in VertiPaq mode will deliver the best performance, similar to MOLAP. However, data latency will also improve if you work in real time (like ROLAP) against a SQL Server database that has columnstore indexes, as long as you don't query massive data volumes. It should be fast, but still slower than working directly against the BISM VertiPaq store.
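To put item 5 in perspective, here is the kind of MDX a year-to-date calculation typically requires today. The measure and hierarchy names are hypothetical; this is a sketch, not code from a shipping model:

  // MDX cube script: a year-to-date calculated member.
  // Hypothetical metadata ([Sales Amount], [Date].[Calendar]).
  CREATE MEMBER CURRENTCUBE.[Measures].[Sales Amount YTD] AS
      Aggregate(
          PeriodsToDate([Date].[Calendar].[Calendar Year],
                        [Date].[Calendar].CurrentMember),
          [Measures].[Sales Amount]);

In DAX, assuming similar table names, the equivalent measure reduces to a one-liner along the lines of TOTALYTD(SUM(Sales[Sales Amount]), 'Date'[Date]) – no hierarchy navigation required, which is exactly the simplification BISM is after.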

The Bad

In Denali, BISM will not be as feature-rich as UDM, and it probably won't be for years to come. The lack of more advanced features (we don't have a feature comparison list yet) will probably make organizations and consultants favor UDM in the interim, which is still the "bread and butter" of MS OLAP. I personally won't go for a new model knowing that its limitations can bite me in the long run, or for a model that sets me back a decade, no matter how simple and fast it is. As you know, all projects start simple until the one requirement that changes everything.

So, why didn't Microsoft leave BISM in the self-service BI area to marinate for a while as PowerPivot?

  1. BISM lets Microsoft level the playing field against the various enterprise BI semantic models.
  2. It removes the Excel hosting limitations, such as the 2 GB size limit of an Excel workbook. In reality, 2 GB translates to many more gigabytes of uncompressed data, but that's still not enough for corporate data volumes.
  3. Microsoft can now position BISM as a BI platform for both reporting and analytics without abandoning relational concepts in favor of "weird" OLAP terminology. This is something that mainstream IT will find very attractive.
  4. BISM will provide a self-service-to-corporate BI continuum on a single platform. You can start with self-service BI in PowerPivot and migrate it to a corporate solution.

Unfortunately, since BISM will not get all UDM features, at least in Denali, in the interim we will have to make a choice between UDM and BISM, which will essentially be a compromise between features and ease of use. Over the years, as BISM catches up with UDM, I'd expect the choice to become easier. Meanwhile, BISM will not be as advanced as UDM, and for more demanding requirements UDM is your only choice.

That said, even in its first release, you may find BISM much more advanced and powerful than competing technologies, such as the BusinessObjects Universe and similar metadata storage models. So, when evaluating BISM capabilities, it is important to pay attention to your reference point. UDM is a very high bar. However, if you compare BISM against the Universe, QlikView, etc., you will probably find it superior.

The Ugly

Crescent (the new SSRS ad-hoc reporting tool) will support BISM only. Frankly, I don't know what the thinking was here, especially given that UDM is still the "bread and butter" of OLAP (see the links above). The message everyone will get is that UDM has fallen out of favor, which is apparently not the case. I strongly encourage Microsoft to support UDM in Denali even if it won't make it in the box (not a big surprise there; Report Builder 2.0 did it and hell didn't freeze over). I think Crescent has a lot of promise, and since we don't have a web-based OLAP browser, it is long overdue on the BI wish list. There are probably good technical reasons for supporting BISM only, but if Excel can run MDX against PowerPivot, so should Crescent. This would also reinforce the message that UDM is still the premium model for years to come while BISM is in the works.

So?

BISM has a bright future and will be a winner if it delivers on its promise without throwing the baby out with the bathwater, that is, if it:

  1. Simplifies OLAP
  2. Preserves UDM flexibility and features we have today, especially the ones that came with SSAS 2005, such as flexible dimension relationships and scope assignments. 
  3. Supports both DAX and MDX as query and expression languages to preserve the investment developers and vendors have made throughout all these years.

Do I feel that my consulting business will be negatively affected by BISM should it become so user-friendly that even my mom can use it? I don't think so. Here is where the C++/C# analogy could be helpful. Those C++ developers who took the time to learn C#, which for them was a walk in the park, found that it opened new opportunities. And again, no matter how simple the tool, the requirements are what make things complex, and I don't think BISM will read minds any time soon. Sure, sometimes an old dog must learn new tricks, but that's why we love it…

May you live in interesting times!
A Chinese proverb

More Details about BISM (UDM 2.0)

It looks like this year's SQL PASS was the one to attend. Chris Webb, who apparently shares the same feelings and emotions about the seismic change in Analysis Services 11 as I do, has posted a blog from the summit. A must read…

Crescent on the Horizon

Now that the official word got out during Ted Kummert's keynote today at SQL PASS, I can open my mouth about Crescent – the code name of an ad-hoc reporting layer that will be released in the next version of SQL Server – Denali. Crescent is a major enhancement to Reporting Services and the Microsoft Self-Service BI strategy. Up to now, SSRS didn't have a web-based report designer. Denali will change that by adding a brand new report authoring tool powered by Silverlight. This will be the fifth report designer after BIDS, Report Builder 1.0 (not sure if RB 1.0 will survive SQL 11), Report Builder 3.0, and the Visual Studio Report Designer.

Besides bringing report authoring to the web, what's interesting about Crescent is that it will redefine the report authoring experience and even what a report is. Traditionally, Reporting Services reports (as well as reports from other vendors) have been "canned"; that is, once you publish the report, its layout becomes fixed. True, you could implement interactive features to jazz up the report a bit, but changes to the original design, such as adding new columns or switching from a tabular layout to a crosstab layout, require opening the report in a report designer, making the changes, and previewing/republishing the report. As you may recall, each of the previous report designers has separate design and preview modes.

Crescent will change all of this and make the reporting experience more interactive, similar to an Excel PivotTable and tools from other vendors, such as Tableau. Those of you who saw the keynote today got a sneak preview of Crescent and its capabilities. You saw how the end user can quickly create an interactive report by dragging metadata, à la Microsoft Excel, and then change the report layout with a few mouse clicks without switching to design mode. In fact, Crescent doesn't have a formal design mode.

How will this magic happen? As it turns out, Crescent will be powered by a new ad-hoc model called Business Intelligence Semantic Model (BISM) that will probably be a fusion between SMDL (think Report Builder models) and PowerPivot, with the latter now also supporting relational data sources. Amir's demo showed an impressive response time when querying a billion rows from a relational database. I still need to wrap my head around the new model as more details become available (stay tuned), but I am excited about it and the new BI scenarios it will make possible besides traditional standard reporting. It's great to see the Reporting Services and Analysis Services teams working together, and I am sure good things will happen to those who wait. Following the trend toward SharePoint as a BI hub, Crescent unfortunately will be available only in SharePoint mode. At this point, we don't know which Reporting Services and RDL features it will support, but one can expect tradeoffs given its first release, brand new architecture, and self-service BI focus.

So, Crescent is the code name for a new web-based fusion between SSRS and BISM (to be more accurate, Analysis Services in VertiPaq mode). I won't be surprised if its official name ends up being PowerReport. Now that I've piqued your interest, where is Crescent? Crescent is not included in CTP1. More than likely, it will be in the next CTP, which is expected around the January timeframe.

ResMon Cube Sample

Greg Galloway just published a ResMon cube sample on CodePlex. It aggregates execution statistics from the Analysis Services dynamic management views (DMVs) – memory usage by object, perfmon counters, aggregation hits/misses, and current session stats – and makes them easily available for slicing and dicing in a cube. I think this will be a very useful tool for analyzing the runtime performance of an Analysis Services server, or a learning tool for understanding how to work with DMVs. Kudos to Greg!
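If you haven't worked with the SSAS DMVs, they are exposed as $SYSTEM schema rowsets and can be queried with a restricted SQL-like syntax from an MDX query window in SSMS. A minimal sketch (DISCOVER_SESSIONS is a documented rowset behind the session stats; the column list here is trimmed for brevity):

  -- Run from an SSMS MDX query window against an SSAS instance:
  -- who is connected and what did each session run last?
  SELECT SESSION_SPID, SESSION_USER_NAME, SESSION_LAST_COMMAND
  FROM $SYSTEM.DISCOVER_SESSIONS

The ResMon cube spares you from writing and correlating such queries by hand.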

Atlanta BI SIG December Meeting

If you use Microsoft BI, live in or within driving distance of Atlanta, and don't know about the Atlanta BI SIG, you are missing a lot. At our last meeting we had some 50+ people, and our attendance is growing! Due to the holidays, Atlanta BI SIG will skip its usual end-of-month meetings in November and December. Instead, our next meeting will be held on December 6th. I updated the Atlanta BI SIG home page to announce the December meeting.

The end of the year is a good time for reflecting on the past and planning for the future. Bob Abernethy from Strategy Companion will present BI past, present, and future trends. He will also show us how Strategy Companion integrates with Analysis Services.

Topic: BI: Then and Now?
Level: Beginner
Date: Monday, December 6, 2010
Location: Matrix Resources Dunwoody Office
Sponsor: Strategy Companion Corporation

Speaker: Bob Abernethy, SVP & GM of Strategy Companion Corporation
Bob Abernethy is SVP & GM of Strategy Companion Corporation. A veteran of Oracle Corporation and Siebel Systems, Bob brings over twenty years of software industry experience to his discussions with customers about their Business Intelligence implementations. Bob received his Bachelor of Science degree from Cornell University in New York and his Master of Management Information Systems from West Coast University in Southern California.

Overview: We will begin by taking a look at how the focus and characteristics of Business Intelligence have changed over the last 25 years. We will also discuss the recent history of Microsoft's focus on BI and take an in-depth look at another approach to SQL Server-based BI provided by Strategy Companion Corporation. You will see why companies such as Citigroup, L'Oreal, Honeywell, DataQuick, and many others have embraced Analyzer, Strategy Companion's award-winning front-end to Analysis Services, for their Business Intelligence applications. You'll see why SQL Server Magazine recently called Analyzer "the best solution to complete the Microsoft BI platform" (Editor's Best Award, December 2009). And you'll learn ways to quickly add significant value to your SQL Server-based data – the kind of value business people will be able to see, understand, and appreciate.

See you there!

More Dundas Components Move to Microsoft

Now, this is great news for Microsoft BI and .NET developers. As announced on the Dundas website, the agreement between Dundas and Microsoft, which allows Dundas to continue to sell, enhance, update, and support its components, will expire on October 31, 2010. Subsequently, all components change hands and become Microsoft intellectual property. These components are:

  • Dundas Chart for ASP.NET (Professional and Enterprise), Dundas Chart for Windows Forms (Professional and Enterprise), Dundas Chart for SharePoint, Dundas Chart for SQL Server Reporting Services, Dundas Chart for OLAP Services
  • Dundas Gauge for ASP.NET, Dundas Gauge for Windows Forms, Dundas Gauge for SharePoint, Dundas Gauge for SQL Server Reporting Services
  • Dundas Map for ASP.NET, Dundas Map for Windows Forms, Dundas Map for SQL Server Reporting Services
  • Dundas Calendar for SQL Server Reporting Services

For me, the most interesting of these is the Dundas Chart for OLAP Services. If you have followed my blog, you know that I've been complaining on a regular basis that, after retiring OWC, Microsoft didn't provide a suitable web-based OLAP browser. In fact, this ranked 3rd on my SSAS wish list last year. Although we won't get a Silverlight-based control anytime soon, I've used the Dundas Chart for OLAP Services (AJAX-based) in my projects and I can say great things about it. It lets you add OLAP browsing features to ASP.NET applications very easily. The Dundas Calendar will be a good addition to SSRS and .NET as well.

At this point, it's not clear which Microsoft products will acquire which components. Most of them (probably excluding the SSRS counterparts) will get added to Visual Studio. I can't wait for this to happen…

So, what will Dundas do after the transition? The Dundas Dashboard – now in version 2.0, with 2.5 coming up soon. More about the Dundas Dashboard is coming up… stay tuned.

UPDATE 10/22/2010

To clarify, Microsoft acquired the intellectual property rights from Dundas back in 2007. At the end of October, Dundas will stop selling these components, which it has been reselling from Microsoft since the acquisition. Dundas will continue to support them through October 31, 2011 for existing customers with support agreements. Integrating the "orphaned" components into Microsoft offerings is on Microsoft's TODO list.

Where is x64 Excel 2010 Data Mining Add-in?

Mark Tabladillo delivered a great presentation about using Excel for data mining at our last Atlanta BI SIG meeting. The lack of an x64 data mining add-in for Excel 2010 came up. There was a question from the audience whether Microsoft has abandoned the DM technology, since it stopped enhancing it. A concern was also raised that Microsoft might have neglected the corporate BI vision in favor of self-service BI.

That's definitely not the case. I managed to get a clarification from the Microsoft Analysis Services team that corporate BI will be a major focus in "Denali", the code name for the next version of SQL Server – version 11. As for the long-overdue x64 DM add-in for Excel, Microsoft is working on it and will deliver it eventually. Meanwhile, the 2007 add-in works with the 32-bit version of Excel 2010.

For those of you going to SQL PASS (unfortunately, I won’t be one of them), Microsoft will announce important news about Denali and hopefully give a sneak preview about the cool BI stuff that is coming up.

Prologika Training Classes

Our online classes for the remainder of 2010:

Course                 Mentor        Date                          Price
Applied SSRS 2008      Teo Lachev    10/26-10/29 12:00-5:00 EDT    $799 (Register)
Applied SSAS 2008      Teo Lachev    11/23-11/25 12:00-5:00 EDT    $799 (Register)
Applied PowerPivot     Teo Lachev    11/29-11/30 12:00-4:00 EDT    $599 (Register)

Visit our training page to register and for more details.

MVP For Another Year!

I just got the news that my MVP status has been extended for another year! This makes it six consecutive years as an MVP and a member of an elite group of professionals that I am proud to belong to.

Atlanta BI SIG September Meeting

Atlanta BI fans, join our next Atlanta BI SIG meeting! Mark Tabladillo (Ph.D., Industrial Engineering, MCAD.NET, MCT) will show us how to do data mining with PowerPivot. And Dundas will demonstrate their latest BI offering – the Dundas Dashboard. Here are the details:

Please RSVP to help us plan food as follows:

  1. Go to the Atlanta BI home page (atlantabi.sqlpass.org).
  2. Choose Yes and submit the RSVP survey found at the right top corner of the page.

 

Main Topic: Data Mining with PowerPivot 2010
Level: Intermediate
Date: Monday, September 27, 2010
Time: 6:30 PM
Location: Matrix Resources Dunwoody Office, 115 Perimeter Center Place, Suite 250 (South Terraces Building), Atlanta, GA 30346

Speaker: Mark Tabladillo (Ph.D., Industrial Engineering, MCAD.NET, MCT)
Mark Tabladillo provides consulting and training for data mining with Solid Quality Mentors. He has taught statistics at Georgia Tech and for the graduate business school of the University of Phoenix. Mark has years of deep experience with the SAS System and has presented at many local, regional, and national technical conferences. Mark maintains a data mining resource and blog at http://www.marktab.net.
Overview: Excel provides a compelling and ubiquitous interface for Microsoft data mining. With new features available through PowerPivot, business users can apply the technology through a well-designed infrastructure of Microsoft technologies. This presentation will welcome newcomers to data mining and provide interactive demos that highlight data mining through these technologies.

Sponsor Presentation: Dundas

Dundas will present their latest BI offering: Dundas Dashboard. Dundas Dashboard is a flexible, turnkey solution for the rapid development of business dashboards. Whether you are leveraging an existing BI infrastructure/application or starting a standalone project from scratch, Dundas offers the industry’s most cost-effective platform for creating/deploying sophisticated digital dashboards and empowering users quickly and easily.