Why We Can’t Connect Anymore?

New OS (Windows 7 in this case) and new issues when trying to connect to Analysis Services on a remote server.

I installed Windows 7 and added it to a corporate domain. I could use SSMS to connect to some SSAS servers, but there was one server (the most important one, of course) that refused connections with the following dreadful message, which I’m sure you’ve seen somewhere along the line:

 

TITLE: Connect to Server

——————————

Cannot connect to <servername>.

——————————

ADDITIONAL INFORMATION:

The connection either timed out or was lost. (Microsoft.AnalysisServices.AdomdClient)

——————————

Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host. (System)

——————————

An existing connection was forcibly closed by the remote host (System)

 

I was able to connect to that server from other non-Windows 7 machines, so I narrowed the issue down to Windows 7. Interestingly, running SQL Server Profiler on the server showed that the right Windows identity was passed, so the issue wasn’t with the server itself either. Even more interesting, I had no problems connecting to the SQL Server Database Engine on that server using Windows Authentication. Attempting to connect to the server with SQL Server Profiler from the Windows 7 machine, or attempting to browse the cube in BIDS, would show another enlightening message:

“The security database on the server does not have a computer account for this workstation trust relationship”

Binging the internet showed that Kerberos doesn’t agree with Windows Server 2008 on the server and Windows Vista (or Windows 7) on the client. But, in my case, the server was running Windows Server 2003 with SP2. More binging and a Eureka moment after reading KB 976419, which luckily had just been released.

Windows Live ID Sign-in Assistant 6.5 installs a new Security Support Provider (SSP) package. The SSP package causes the Security Support Provider Interface (SSPI) that is used by the Analysis Services OLE DB provider not to switch to NTLM authentication.

You gotta be kidding me! So, the unassuming Live ID Assistant, which every Live application out there recommends, nukes NTLM connectivity to Analysis Services? In case you are interested, since a hotfix wasn’t available yet, I went with method 2, which unregisters the livessp provider, and the problem went away.

I’ve got to say that for the past two years I’ve had my fair share of connectivity issues with Analysis Services. It looks to me like the Analysis Services team needs to go back to the OLE DB provider drawing board and solve these issues. If I can connect to the Database Engine, I should be able to connect to Analysis Services.

Applied Analysis Services 2008 Online Training Class with Teo Lachev

We are excited to announce the availability of an online Analysis Services 2008 class – no travel, no hotel expenses, just 100% content delivered right to your desktop! This intensive 3-day online class (14 training hours) is designed to help you become proficient with Analysis Services 2008 and acquire the necessary skills to implement OLAP and data mining solutions. Learn how to build an Adventure Works cube from scratch. Use the opportunity to ask questions and study best practices.


Applied Analysis Services 2008 Online Training Class
with Teo Lachev

Date: November 16 – November 18, 2009
Time: Noon – 4:30 pm EDT; 9 am – 1:30 pm PDT
14 training hours total for only $799!


 Attend this class for only $799 and get a free unrestricted e-book copy of the book Applied Analysis Services 2005 by Teo Lachev!

For more information or to register click here!

Phantom URLs

An interesting “issue” popped up today. I did a fresh install of Windows 7 on my brand new HP 8530w laptop, which is a great laptop BTW. I installed SQL Server 2008 followed by SP1, only to find that the Web Service URL and Report Manager URL didn’t show up in the Reporting Services Configuration Manager.

[Screenshot: Reporting Services Configuration Manager with the URLs missing]

The report server would run just fine on http://localhost/reportserver, and the URLs were registered correctly in rsreportserver.config. I went as far as whipping out some WMI code, and it returned the URLs as it should. But the darn URLs wouldn’t show up. Not to mention that another user had recently reported the exact same issue to me. A bug? I started thinking of reinstalling SQL Server.

Then, a Eureka moment! As with most things, there was an easy explanation. For some reason, Windows 7 had set the system font size to 125%. Resetting it to 100% fixed the issue, although it would have been a good idea to increase the label height in the Reporting Services Configuration Manager to avoid such font-scaling issues.

Intelligencia for Silverlight

After retiring Office Web Components (OWC), which can hardly pass the web test as it is implemented as an ActiveX control, Microsoft left developers with no web-based browser for Analysis Services. True, for those who can afford the hefty price tag, Microsoft Office SharePoint Server (MOSS) supports server-side Excel spreadsheets that render to HTML. However, while Excel rules the Windows-based OLAP browser space, HTML-based Excel spreadsheets with limited filtering can hardly meet the demand for web-based interactive reporting. Then, there is the option to author Reporting Services OLAP reports, but outside of limited interactive features, their layout is pretty much fixed at design time.

What’s really needed is a Silverlight Analysis Services control that ships with Visual Studio to let developers embed an Excel-like Analysis Services browser into their ASP.NET applications. You need this control and I need it, but it’s not coming anytime soon. Meanwhile, third-party vendors are rushing in to fill the gap.

In a recent project, we used the Dundas OLAP Chart for Analysis Services, which I mentioned in one of my previous blogs. Dundas has just released version 7 of this control and I really like what I see. It’s currently the best OLAP-based chart in the industry. The Dundas OLAP Chart for Analysis Services is a regular ASP.NET control with AJAX features for improved user experience. With Silverlight establishing itself as the platform of choice for web developers, you may be looking for a Silverlight-based Analysis Services browser. This is where Intelligencia for Silverlight comes in.

[Screenshot: Intelligencia for Silverlight]

I blogged about Intelligencia for Reporting Services before and pointed out where I think it surpasses the MDX Query Designer included in Reporting Services. What IT-Workplace has done this time is bring its product to the web, and it has wisely decided to use Silverlight. The moment I saw the new version, it grabbed my attention. Users familiar with Excel PivotTable reports will find the Intelligencia metadata pane (Field List) very similar. Users can create a report by dragging and dropping objects onto the report pane on the right. Actually, I would love to see IT-Workplace adding the Filter, Columns, Rows, and Values panes below it, just like Excel. I’d also welcome in-place filtering, just like Excel. You get the idea. For me and my users, the more Excel-like the browser is, the better. This lowers the learning curve and unifies the desktop and web experiences.

But Intelligencia for Silverlight is more ambitious than just bringing Excel-like reporting to the web. The control has a scriptable interface and a filter control, which allows management dashboards to be created by linking grids and filters, as the first link on the product web page demonstrates.

 

In summary, while still rough around the edges (it’s currently in a beta phase), I think Intelligencia for Silverlight has a lot of potential and is positioned to fill a wide gap left by Microsoft by letting web developers embed an Analysis Services OLAP browser in their applications. Powered by Silverlight, Intelligencia for Silverlight could bring Excel PivotTables to the web. Visit the Intelligencia for Silverlight web page, download and test the control, and provide feedback to help improve it.

Aggregating Many-to-Many Relationships

Relax…this blog is all about Analysis Services and not about polygamy or something worse.

SSAS 2005 went beyond the “classic” OLAP model by introducing flexible dimension-to-measure group relationships, including many-to-many, referenced, and fact relationships. M2M relationships can solve some nagging business problems. For example, in a recent project where we had to implement a financial cube, there was a requirement that a G/L account can belong to multiple account groups. This is a classic M2M scenario which can be elegantly solved by introducing an M2M relationship. Adventurizing this, let’s take a look at the following schema.

[Diagram: the modified Adventure Works schema]

Here, I changed the Adventure Works schema by adding the DimGroup and DimGroupEmployee tables. The DimGroup dimension is a parent-child dimension that lets the end user slice data by territory. An employee can be associated with one or more territories (think of a sales manager who covers multiple territories). This M2M relationship between employees and groups is resolved via the DimGroupEmployee table. In the cube, DimGroup is configured to join the Reseller Sales measure group via a Group Employee factless measure group that is bound to the DimGroupEmployee table. This lets us browse the reseller measures by group.

[Screenshot]

So, everything is cool, right? Yes, at least until I realized that the aggregated totals need to be adjusted for some account groups. To demonstrate this, suppose that you need a scope assignment that overwrites Georgia’s total:

([Group].[Groups].[Georgia], [Measures].[Reseller Sales Amount]) = 10;

[Screenshot: results after the scope assignment]

Do you see the problem? The Group1 total didn’t budge. However, if you do the same on a parent-child dimension that joins the measure group via a regular relationship, the total would change. Microsoft declared this M2M behavior by design. When you come to think about it, it makes sense. To aggregate the measures, the server must go through the bridge table (DimGroupEmployee) to find the qualifying members (DimEmployee) that belong to the member on the one side of the relationship. So, when you make an assignment, you are essentially overwriting this behavior.

If a calculation has a slice in an M2M dimension, it applies to a cell only if the cell coordinate matches the M2M slice. When a cell is at a higher level for the M2M attribute, the calculation does not apply and will not be rolled up to the parent level. Intuitively, the M2M attribute is non-aggregatable. In this case, when calculating the grand total, the SSAS calculation engine will not sum up the subtotals from each group. However, if the assignment is made on the many side (DimEmployee in our case), then aggregating on DimGroup works as expected.
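For illustration, here is a minimal sketch of an assignment made on the many side (the employee member name is hypothetical). A value written at the Employee grain does roll up through the DimGroupEmployee bridge to the group totals:

-- Hypothetical sketch: scoping on the many side (an individual employee)
-- lets the assigned value roll up through the bridge table to DimGroup.
Scope ([Employee].[Employees].[Some Employee], [Measures].[Reseller Sales Amount]);
    this = 10;
End Scope;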

Now, of course, as a modeler you would hope for a magic switch to let you roll up the assigned values and align the M2M behavior with regular relationships, but for now we have to deal with the “by design” behavior. Microsoft suggested a workaround: introduce a scope assignment that forces the M2M members to aggregate, something like this:

SCOPE ([Group].[Groups].Members);
    this = IIF (IsAncestor([Group].[Groups].CurrentMember, [Group].[Groups].[Georgia]),
                Sum([Group].[Groups].CurrentMember.Children),
                ([Group].[Groups].CurrentMember, [Measures].[Reseller Sales Amount]));
END SCOPE;

This works, but it may introduce a performance penalty with larger dimensions. Instead, in our case, we decided not to use an assignment at all. We introduced a new helper measure to the measure group (FactResellerSales in this case) that performs the required calculations. Since the measure values are mapped multiple times on the many side, we use simple assignments to zero out the measure for the groups to which it doesn’t apply.
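For what it’s worth, a zero-out assignment along these lines is all it takes (the helper measure and group member names below are made up for illustration):

-- Hypothetical sketch: blank out the helper measure for a group it doesn't apply to
Scope ([Group].[Groups].[Some Group], [Measures].[Adjusted Sales Amount]);
    this = 0;
End Scope;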

Passing Multivalued Parameters in SQL Server 2008

One year after I submitted this small article, SQL Server Magazine has finally published it. The article demonstrates how you can use table-valued parameters (TVPs), a new feature in SQL Server 2008, to pass multivalued parameters from a report to a stored procedure.
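If you haven’t seen TVPs yet, the general pattern looks something like the sketch below. The type, procedure, and table names are made up for illustration and are not the article’s actual code.

-- A user-defined table type holds the selected parameter values
CREATE TYPE dbo.ProductList AS TABLE (ProductID int NOT NULL PRIMARY KEY);
GO
-- The stored procedure receives the whole list as a single READONLY parameter
CREATE PROCEDURE dbo.usp_GetProductSales
    @Products dbo.ProductList READONLY
AS
BEGIN
    SELECT s.ProductID, SUM(s.SalesAmount) AS SalesAmount
    FROM dbo.FactSales AS s                 -- hypothetical fact table
    JOIN @Products AS p ON p.ProductID = s.ProductID
    GROUP BY s.ProductID;
END
GO

On the client side, you fill a DataTable with one row per selected parameter value and pass it as an ADO.NET parameter of type SqlDbType.Structured.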

The code can be downloaded from the SQL Server Magazine site or from here.

Heat Maps as Reports

Continuing my intrepid journey through the new Reporting Services R2 enhancements, in this blog I’ll demonstrate some of the cool map features. As you’ve probably heard, R2 brings a brand new map control that lets you visualize spatial data on your reports. Since mapping is one of the major enhancements in R2, there will be plenty of resources covering it in detail. For example, Robert Bruckner has written a great blog to get you started with mapping. The SSRS R2 forum adds more resources.

But what if you don’t need to visualize geospatial data, such as restaurants in the Seattle area? You shouldn’t bother with the map, right? Not so fast. What’s interesting is that the map supports the two spatial data types in SQL Server: geography and geometry. The latter lets you visualize everything that can be plotted on a planar coordinate system. That’s pretty powerful when you come to think about it. If you have the item coordinates, you can map pretty much anything. A few weeks ago, for example, I saw a Microsoft sample map report that plotted a floor layout. Today, I’ll show you a report that I whipped up over the weekend that demonstrates how to visualize a heat map.

[Screenshot: the heat map report]

The Heat Map Report

A heat map is a graphical representation of data where the values taken by a variable in a two-dimensional map are represented as colors. The most basic variation of a heat map is a tree map, which lets you visualize data trends presented as nested rectangles. In our case, the sample report shows the Adventure Works reseller sales. As shown in the screenshot, the larger the rectangle is, the more sales were contributed by that reseller. The same can be observed from the rectangle color, which ranges from green (most sales) to red (fewest sales). The greener the color is, the more sales that reseller has. I chose to use sales as the color variable, although I could have used another measure, such as number of employees, profit, and so on. You can download the source code from here.

Limitations

I’ll be quick to point out the following limitations of this reporting solution:

  1. As it stands, producing the report requires a two-step approach. First, you need to prepare the dataset and save it in a SQL Server table. Second, you run the report from the saved data. Unfortunately, the ReportViewer control doesn’t support RDL R2, which means it doesn’t support maps, so you cannot bind the dataset and run the report in one shot. While we are waiting for Microsoft to update ReportViewer, you can use a server-side custom data extension that exposes the ADO.NET dataset with the spatial data as a report dataset. This approach will let you bind the ADO.NET dataset to a server report. Chapter 18 in my book’s source code includes such an extension.
  2. The performance of the algorithm that I use to calculate the coordinates of the nested rectangles degrades after a few hundred rows in the dataset. If you need to map more results, you may need to revisit the algorithm to see if you can optimize it.

Understanding Rectangle Geometry

SQL Server describes polygons in the Well-Known Text (WKT) standard sponsored by the Open Geospatial Consortium. For instance, a rectangle is described as five (surprise) points on the coordinate system, as shown below, where the fifth point is the same as the first point.

[Figure: a rectangle described as five WKT points]

You can convert WKT to the SQL Server geometry data type by using the STPolyFromText method, such as:

geometry::STPolyFromText('POLYGON ((0 0, 0 125.6331, 115.0095 125.6331, 115.0095 0, 0 0))', 0)

Calculating the Coordinates

To calculate the nested rectangle coordinates, I used the excellent Squarified Treemaps algorithm by Jonathan Hodgson. Jonathan explains in detail how the algorithm works. My humble contribution added the following changes:

  1. I decoupled the algorithm from his Silverlight solution into a C# console application.
  2. Instead of loading the sample dataset from an XML file, I load it from the Analysis Services Adventure Works cube. Specifically, the MDX query returns the top fifty resellers ordered by Reseller Sales (a sketch of such a query follows after this list).
  3. I added Console.WriteLine statements to output insert queries which you can execute to populate the Test table.
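For reference, a query along these lines returns the top fifty resellers by sales from the Adventure Works sample cube (a simplified sketch, not necessarily the exact query in the solution):

SELECT [Measures].[Reseller Sales Amount] ON COLUMNS,
       TopCount([Reseller].[Reseller].[Reseller].Members, 50, [Measures].[Reseller Sales Amount]) ON ROWS
FROM [Adventure Works]

The returned rows feed the treemap algorithm, and the generated INSERT statements populate the Test table, which is defined as follows: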

CREATE TABLE [dbo].[Test](
    [ID] [int] IDENTITY(1,1) NOT NULL,
    [Name] [varchar](50) NULL,
    [Shape] [geometry] NOT NULL,
    [Size] [decimal](18, 5) NULL,
    [Area] AS ([Shape].[STArea]()),
CONSTRAINT [PK_Test] PRIMARY KEY CLUSTERED
(
    [ID] ASC
) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

insert into test(Shape, Size, Name) values (geometry::STPolyFromText('POLYGON ((0 0, 0 125.6331, 115.0095 125.6331, 115.0095 0, 0 0))', 0), 877107.2, 'Brakes and Gears')
insert into test(Shape, Size, Name) values (geometry::STPolyFromText('POLYGON ((0 125.6331, 0 247.9348, 115.0095 247.9348, 115.0095 125.6331, 0 125.6331))', 0), 853849.2, 'Excellent Riding Supplies')

Once you execute the INSERT statements to populate the Test table, you can use the SQL Server spatial data visualizer to view the results:

[Screenshot: the nested rectangles in the spatial results viewer]
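Any query that returns the geometry column will activate the Spatial results tab in SQL Server Management Studio, for example:

SELECT Name, Size, Shape
FROM dbo.Test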

Authoring the Report

Once you have the data, authoring the report is easy.

  1. Create a new report in Report Builder 3.0. Add a dataset that uses the following query:
    SELECT * FROM Test
  2. Click the Map icon to start the Map wizard.
  3. On the Choose a Source for Spatial Data step, choose the SQL Server Spatial Query option and click Next.
  4. On the Choose the Dataset step, select the Dataset from step 1.
  5. On the Choose Spatial Type step, the map wizard should identify the Shape column as a geometry data type. At this point, the pre-release Report Builder 3.0 shows an empty map because it doesn’t recognize the coordinate system as Planar. Don’t worry, we’ll fix this later.
  6. On the Choose Map Visualization step, select the Color Analytical Map option.
  7. On the Choose the Analytical Dataset step, select the same dataset because it has a Size column (reseller sales) that we will use for the color visualization.
  8. On the Choose Color Theme step, choose =Sum(Fields!Size.Value) as the field to visualize and Red-Green as the color rule, because the resellers with fewer sales will be coded red. Click Finish to exit the wizard.
  9. In design mode, select the map and change the Viewport CoordinateSystem property to Planar.

[Screenshot]

Once you’ve changed the coordinate system, the polygons should magically appear. I’ve made a few more changes but you should be able to understand them by examining the map properties.

During my Reporting Services Tips and Tricks TechEd 2009 presentation, one of my most important tips was to encourage you to write more reports and less custom code. I showed a cool SharePoint-based dashboard that we built by assembling report views. Following this line of thought, consider report maps instead of writing custom code when you need to visualize spatial data of any kind.

Yet Another Relative Dates Implementation

Yet another relative dates implementation in an Analysis Services cube. So far, I’ve implemented two. But I’m now starting a project to implement an accounting (G/L) cube, and the traditional relative date approach, where each relative date is calculated for the current member in the Date dimension, doesn’t fly anymore. Heck, the business analyst even found the excellent “Translating Cognos PowerPlay Transformer Relative Date Functionality into SQL Server Analysis Services (SSAS)” whitepaper by David Greenberg and told me that’s exactly what they need. That’s what happens when you have smart users. PowerPivot, anyone? While the business users are not empowered yet, let me share how I implemented this approach.

What’s Needed?

The difference now is that we want to drill down from each relative date to the periods that the relative date spans. Let’s take a look at the screenshot below.

[Screenshot]

The cube sets the default member of the Date hierarchy to the accounting closing period, which is May 2008 in this case. The Current Date relative date shows the actual measure values for May 2008 only. The Current QTD shows the actual measure values for April and May 2008. The Current QTR shows the actual measure values for April, May, and June, and so on. In other words, when the user slices by the Relative Dates and Date dimensions, the system shows the actual measure values for the periods that belong to that relative date. Consequently, the only way for the user to get the calculated relative date values, such as the Current YTD aggregate, is to force the Date dimension to the [All] member, but that’s fine with the business users.

Why We Disagree?

My implementation differs from the above-mentioned whitepaper in the following ways:

  1. I use a regular Relative Dates dimension instead of the dimension calculated members that the SSAS BI Wizard and the whitepaper use. It looks to me like OLAP clients compete with each other over who can butcher dimension calculated members the most. Excel, for example, doesn’t retrieve calculated members by default, and it doesn’t let you select individual calculated members. Other tools ignore them altogether. To avoid such pitfalls, I defined the Relative Dates dimension as a regular dimension.
  2. The whitepaper defines extra columns in the DimDate table that flag the qualifying dates. My implementation doesn’t do this because it’s not needed.
  3. The whitepaper produces the relative dates results by aggregating the measure across the qualifying date members. My approach uses simple scope assignments as you will see in a moment.

How This Got Implemented?

The implementation of this Relative Dates approach is remarkably simple.

  1. Implement a SQL view, on which the Relative Dates dimension will be based, that hardcodes the Relative Dates members with SELECT…UNION statements (a sketch follows below). Note that the first row, which the Date dimension will join to, doesn’t have a name because we will hide it in the next step.

[Screenshot: the Relative Dates view]
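A minimal sketch of such a view might look like this (the view and column names are illustrative, and only a few members are shown):

CREATE VIEW dbo.vRelativeDates AS
SELECT 1 AS RelativeDateKey, '' AS RelativeDateName  -- unnamed first member that joins to DimDate; hidden in the next step
UNION ALL SELECT 2, 'Current Date'
UNION ALL SELECT 3, 'Current QTD'
UNION ALL SELECT 4, 'Current QTR'
UNION ALL SELECT 5, 'Current YTD';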

  2. Set up a Relative Dates dimension. Here I use a little trick to hide the first member because the user shouldn’t see it. Specifically, I set the AttributeHierarchyVisible property of the Relative Dates dimension key attribute to False. Then, I created a Relative Dates hierarchy and set the HideMemberIf property of the Relative Dates level to NoName.

    [Screenshot: the Relative Dates dimension]

  3. I added a named calculation called RelativeDateKey to the DimDate table in the data source view, with an expression of 1, so I can relate the Relative Dates and Date dimensions.

[Screenshot: the RelativeDateKey named calculation]

  4. Next, we relate the Relative Dates dimension to all associated measure groups with a referenced relationship via the Date dimension.

[Screenshot: the referenced relationship]

  5. Time for some MDX programming. We use script assignments to populate the Relative Dates cells. By default, all cells intersecting with all Relative Dates members (except for the RelativeDateKey member) are empty. So, we just need to “carry over” the measure values from the qualifying members of the Date dimension. Here is what the first three assignments look like:

-- Current Date
Scope ([Relative Dates].[Current Date], [Date].[Date].DefaultMember);
    this = [Relative Dates].&[1];
End Scope;

-- Current QTD
Scope ([Relative Dates].[Current QTD],
       Exists(PeriodsToDate([Date].[Date Hierarchy].[Quarter], [Date].[Date Hierarchy].DefaultMember),
              [Date].[Date].[Date].Members));
    this = [Relative Dates].&[1];
End Scope;

-- Current QTR
Scope ([Relative Dates].[Current QTR],
       Exists(Descendants(Ancestor([Date].[Date Hierarchy].CurrentMember, [Date].[Date Hierarchy].[Quarter]),
                          [Date].[Date Hierarchy].[Date]),
              [Date].[Date].[Date].Members));
    this = [Relative Dates].&[1];
End Scope;

Prior to these assignments, the cube script sets the default member of the Date hierarchy to the accounting closing period. Each assignment scopes on the qualifying members of the Date dimension and sets the scope to [Relative Dates].&[1], which, as we’ve seen, is the first member of the Relative Dates dimension.
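Setting that default member in the cube script looks something like this (the hierarchy and member key are illustrative):

ALTER CUBE CurrentCube
UPDATE DIMENSION [Date].[Date Hierarchy],
DEFAULT_MEMBER = [Date].[Date Hierarchy].[Month].&[200805];  -- the accounting closing period (May 2008)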

Why Does This Work?

The script carries over the measure values to the Relative Date members that intersect with the members of the Date attribute hierarchy. From there on, the measure aggregation function kicks in to roll up the values, e.g. by month, year, and quarter.

Reporting Services 2008 Online Training Class with Teo Lachev

There is still time to sign up for our Applied Reporting Services 2008 class with Teo Lachev. This three-day intensive event teaches you the knowledge and skills you need to master Reporting Services to its fullest. No travel, no hotel expenses, just 100% in-depth SSRS training delivered right to your desktop!

Applied Reporting Services 2008 Online Training Class
with Teo Lachev
Date: August 31 – September 2, 2009
Time: Noon – 4 pm EDT; 9 am – 1 pm PDT
12 training hours total for only $799!

Attend this class for only $799 and get a free paper copy of the book Applied Microsoft SQL Server 2008 Reporting Services by Teo Lachev!

For more information or to register click here!

SQL Server 2008 R2 August Community Technology Preview Available

Microsoft released the SQL Server 2008 R2 August Community Technology Preview, which includes the Report Builder 3.0 August CTP redistributable.