Swapping the RS Catalog

Sometimes, you may need to swap the RS catalog database. For example, you may need to clone the catalog database from a production machine to troubleshoot an issue. One way to do so is to replicate the report items (report definitions, data sources, images, etc.). Another, and perhaps faster, way is to copy the report catalog database and redirect the Report Server temporarily to the new catalog by using the handy Reporting Services Configuration Manager (new with RS 2005). The following steps worked for me to redirect my local SSRS instance (SP1+hotfix) to a different catalog (from SSRS 2005 RTM).

  1. Use the SQL Server Management Studio to create a new database, e.g. ReportServerTest.
  2. Create a new database ReportServerTestTemp to serve as the Report Server Temp database.
  3. Back up the production database you need to troubleshoot.
  4. Restore the production database to ReportServerTest. Make sure to rename the database files by flipping to the Options tab (e.g. rename the files to ReportServerTest.mdf and ReportServerTest_log.ldf).
  5. Back up the ReportServerTempDB database from the production instance and restore it to ReportServerTestTemp (again renaming the database files). A command-line sketch of steps 1-5 follows the note below.

Note: As its name suggests, the ReportServerTempDB database is just a temporary database used by the Report Server for session caching. Since you don't need its data, you can restore (copy) it from any RS temp database.
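
For reference, here is a minimal command-prompt sketch of steps 1-5 using sqlcmd. The backup locations, data file paths, and logical file names (ReportServer, ReportServer_log, ReportServerTempDB, ReportServerTempDB_log) are assumptions for my setup; run RESTORE FILELISTONLY first to see the actual logical names inside your backups.

     REM Create the two target databases
     sqlcmd -S localhost -E -Q "CREATE DATABASE ReportServerTest; CREATE DATABASE ReportServerTestTemp"

     REM Check the logical file names inside the catalog backup
     sqlcmd -S localhost -E -Q "RESTORE FILELISTONLY FROM DISK = 'C:\Backup\ReportServer.bak'"

     REM Restore the production catalog over ReportServerTest, renaming the physical files
     sqlcmd -S localhost -E -Q "RESTORE DATABASE ReportServerTest FROM DISK = 'C:\Backup\ReportServer.bak' WITH MOVE 'ReportServer' TO 'C:\Data\ReportServerTest.mdf', MOVE 'ReportServer_log' TO 'C:\Data\ReportServerTest_log.ldf', REPLACE"

     REM Do the same for the Report Server temp database
     sqlcmd -S localhost -E -Q "RESTORE DATABASE ReportServerTestTemp FROM DISK = 'C:\Backup\ReportServerTempDB.bak' WITH MOVE 'ReportServerTempDB' TO 'C:\Data\ReportServerTestTemp.mdf', MOVE 'ReportServerTempDB_log' TO 'C:\Data\ReportServerTestTemp_log.ldf', REPLACE"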

  6. Open the Reporting Services Configuration Manager (Microsoft SQL Server 2005 program group -> Configuration Tools -> Reporting Services Configuration) and switch to the Database Setup tab (see attached screenshot).
  7. Enter the name of the new Report Server database (ReportServerTest) and hit Apply. If your SSRS instance is a newer version than the one that produced the catalog (e.g. SP1 vs. RTM), you will be prompted to upgrade the database. If all is well and you don't see error or warning messages, the Report Server will be reconfigured to use the new database. If at this point you try to connect to the Report Server (e.g. /reportserver/), you will get the rsReportServerNotActivated error. The reason is that the Report Server cannot decrypt the content of the new catalog with its encryption key.
  8. In the Reporting Services Configuration Manager, flip to the Initialization tab. You should have two rows (see attached file) in the grid, corresponding to your local computer name (initialized) and the computer which hosted the report catalog you restored. Let's assume you don't have connectivity to the remote machine.
  9. Open the command prompt and execute:

     "C:\Program Files\Microsoft SQL Server\90\Tools\Binn\RsKeyMgmt.exe" -d

This will effectively remove any encrypted content, including connection and subscription credentials.

Note: If you have a backup of the encryption keys from the source server, you can avoid losing the encrypted content by restoring the encryption keys on the target server. Execute the following commands in the order shown: rskeymgmt -e, rskeymgmt -r, and then rskeymgmt -a (see the sketch below).
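
Here is a minimal command-line sketch of that sequence. The key file name and password are placeholders I made up, and the installation ID placeholder stands for the GUID you can read off the Initialization tab.

     REM On the source (production) server: back up (extract) its encryption key
     "C:\Program Files\Microsoft SQL Server\90\Tools\Binn\RsKeyMgmt.exe" -e -f C:\Backup\rskey.snk -p MyKeyPassword

     REM On the target server: remove the source server's key entry from the restored catalog
     "C:\Program Files\Microsoft SQL Server\90\Tools\Binn\RsKeyMgmt.exe" -r <source installation ID>

     REM On the target server: apply the backed-up key so the encrypted content can be decrypted
     "C:\Program Files\Microsoft SQL Server\90\Tools\Binn\RsKeyMgmt.exe" -a -f C:\Backup\rskey.snk -p MyKeyPassword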

  10. At this point the Report Server should be initialized. You may need to go back to the Initialization tab and click Initialize (you should now have only one row, with your computer name).
  11. You can now open the Report Manager and change the connection strings for the data sources.
  12. Once you are done testing, go back to the Reporting Services Configuration Manager and flip the catalog database back to the original ReportServer database.

E-mail Unleashed

Many, many apologies if you have received tons of e-mail from me (Prologika) during the weekend. The good news is that it seems I’ve managed to find a way to send e-mail notifications from my forum. The bad news is that I didn’t notice that some 500 e-mails were stuck in the queue, all of which the e-mail job promptly sent out. Oops!

UDM Data Islands

As I mentioned in my TechEd Memoirs post, Microsoft is scaling back from the “super” cube approach in favor of more granular data “islands”. Since this took me by surprise, I approached the SSAS team for clarification. Here are the answers I managed to get:


1. What performance benefits could be realized by splitting UDM into data islands?


At this point, we don’t have detailed performance numbers we can share. But to give an indication – a cube that had around 15 measure groups, and a very large number of attributes (over 100 cube dimensions, many with more than 100 attributes), showed a performance improvement of about six-fold when split into separate cubes (one per measure group), for queries that only referenced a single measure group. The main benefit came from the reduced cube space – the individual cubes only needed to include dimensions related to the measure group, and therefore paid no overhead for unrelated dimensions. There was also some benefit from the MDX script being simpler for each of the individual cubes.


2. What guidelines should be followed when partitioning UDM?


The general principle is to have multiple cubes, maybe even one per fact table, and then have other cubes that use linked measure groups to combine them into the complete view. Use the small cubes when they meet the needs, and query the full cube when you need measures from more than one measure group.


Some general guidelines are:


– If the dimensionality of all the measure groups is (much) the same, splitting up offers limited benefit (though there is still some benefit from the simpler MDX script per cube).


– If queries very commonly reference measures from more than one measure group, there is no value in splitting those measure groups into separate cubes.


– The performance benefit must be balanced against the negative impact on the end-user view: if a question using measures from just one measure group frequently becomes a question where a measure from another measure group is added, it would be tiresome for the user to switch cubes in the middle.


– A single large cube is easier to manage, as there is no need to duplicate scripts.


For calculations that span measure groups (e.g. where many measure groups have calculations that use an exchange rate), you certainly would not want to use LookupCube. You could either keep all those measure groups in the same cube, or split them up and again use linked measure groups to make the exchange rates available in multiple cubes.


3. By a “data island” do you mean leveraging the linked dimension and measure group features? If so, any known limitations and restrictions?


Yes, it uses linked measure groups, as described above. The dimensions are simply shared and need not be linked. As far as limitations, there is overhead in having to define calculations etc. on the cube that combines the smaller cubes. Also, Report Builder does not handle such databases that well, as the linked measure groups appear as separate entities in the report model (there is nothing saying that Sales in one cube is exactly the same as Sales in another).


4. When can we expect a UDM performance whitepaper?


One is being worked on.

You’ve been Deadlocked

If you’ve been using VS.NET 2005 for a while, chances are that your debugging session has crashed spectacularly just when you thought you were so close to finding that elusive critical bug. This situation may have manifested itself with the following exception:


ContextSwitchDeadlock was detected
Message: The CLR has been unable to transition from COM context <some nasty hex number> to COM context <another nasty hex number> for 60 seconds. The thread that owns the destination context/apartment is most likely either doing a non pumping wait or processing a very long running operation without pumping Windows messages. This situation generally has a negative performance impact and may even lead to the application becoming non responsive or memory usage accumulating continually over time. To avoid this problem, all single threaded apartment (STA) threads should use pumping wait primitives (such as CoWaitForMultipleHandles) and routinely pump messages during long running operations.


Usually, you can’t recover from this exception and the only way to be able to debug again is to restart the debugger (that is, until it crashes again). The bad news is that this is a bug in the ContextSwitchDeadlock managed debugging assistant (MDA) which the VS.NET team couldn’t fix in time. The good news is that you can prevent this MDA from rearing its ugly head ever again by going to the Exceptions dialog (it doesn’t appear by default in the Debug menu, but you can add it by customizing the Debug menu) and disabling the ContextSwitchDeadlock MDA found under the Managed Debugging Assistants category.

TechEd Memoirs

TechEd 2006 US is over. Given the fact that VS.NET 2005, .NET 2.0, and SQL Server 2005 have matured, there weren’t many earth-shattering announcements and cool demos. Anyway, here are some of the most interesting BI highlights, written on the plane on my way home.


Brian Welcker told us that RS SP2 (CTP scheduled for November, final release alongside the Office 2007 launch) will bring a much tighter level of integration between RS 2005 and SharePoint 2007. This is great given that SharePoint is Microsoft’s premier portal solution. Specifically, it will be possible to configure RS in a SharePoint (SSP) mode using the RS configuration utility. In SSP mode, the entire RS catalog is migrated to the SharePoint database. Instead of the Report Manager, the SharePoint Administration portal will be used for report management. For example, the Report Server will authenticate users using the SSP security settings. The SSP Report Viewer (a wrapper around the ASP.NET Report Viewer) will make the report viewing experience more SharePoint-like. For example, the report parameters will be presented in a dropdown list to minimize horizontal space.


Jamie MacLennan from the SSAS team did a great presentation about driving report payloads when UDM is used as a data source. What impressed me most was that Jamie gave us an honest account of some of the problematic integration areas instead of a let-me-show-you-how-cool-everything-we-did-is type of presentation. For example, he pointed out that SSAS hierarchies don’t have Report Builder equivalents. Not only are parent-child dimensions not supported in the Report Builder, but they will give wrong results. He recommended avoiding parent-child hierarchies in the Report Builder whenever possible or replacing them with flattened regular hierarchies in UDM. Jamie shared great tips about associating a calculated member with a measure group so it appears in the Report Builder, and about disabling the automatic Report Builder drillthrough reports for performance reasons when UDM is used as a data source. The biggest surprise for me came when he stated that Microsoft now recommends splitting a large UDM into smaller models that leverage the linked dimension and measure group features. As performance tests by the SSAS team have demonstrated, this approach may result in better memory utilization and performance, as well as easier management.


T.K. Anand and Dan Battagin did a great demo of SSAS-Excel 2007 integration. No big surprises here if you follow David Gainer’s blog. T.K. Anand and Cristian Petculescu took us behind the scenes of the UDM storage model and shared very interesting details about how dimensions and measures are stored. Later, they did another excellent presentation that focused on maximizing UDM performance. They emphasized the importance of designing proper attribute relationships and avoiding duplicate attribute relationships. It was mentioned that there will be an SSAS performance whitepaper in the near future.


I was also given a personal sneak preview of the forthcoming PerformancePoint Server. As I noted in a previous blog, PerformancePoint will unite three products — Scorecard Manager, ProClarity, and Biz#. I asked the presenter to focus on the Biz# portion only. I realized that the client interface will be implemented in Excel 2007. In fact, it seems like the Microsoft Office team expects people to build BI-centric applications inside the Office products after retiring OWC. The PerformancePoint client will support comprehensive models for planning and forecasting. Behind the scenes, these functions will be carried out by the server layer of PerformancePoint, which creates SSAS cubes to save the model data.


Finally, I think my presentation (Building Reporting Solutions with SSRS and SSAS) went really well. The Hilton Logan (the hotel where I stayed) surprised us with a false fire alarm in the wee hours of the morning (around 3:30 am) and decided to evacuate everybody outside. So, I went on stage quite sleep-deprived since I couldn’t get more sleep after the incident (oh, yeah, we had an MVP party the previous night). The presentation took place in one of those vast conference rooms that make me proud to be an American. It could easily hold 500 people and had four large screens. Despite the early hour (8 am), about 200 people showed up to boot up with a healthy dose of Microsoft BI. I got a few interesting questions during the presentation and afterwards in the Business Intelligence discussion area.


As always, TechEd was a great show. Some 12,000 people attended the conference and it was sold out well in advance. I liked Boston too as I enjoy all cities where one can walk instead of sitting paralyzed in a traffic jam. Unfortunately, Atlanta is not one of them 🙁

Next stop – TechEd 2006 in Boston

I am gradually tuning in to the TechEd wave. I am leaving for Boston on Monday. I hope those of you who will attend TechEd will stop by my session “Building Reporting Solutions with Reporting Services and Analysis Services”. I’ve already uploaded the slides and code to my website.


The presentation is currently scheduled for 8 am on Thursday, June 15th, in room 210. As the adage goes, the early bird sings early :-). Hopefully, SSRS and SSAS are just what you need to boot up after the party on Wednesday. Time permitting, I will also hang around one or more of the BI discussions in the Discussion area (wherever that is).


See you in Boston!

When the Host is not so Perfect

Here is something that has recently bitten me really badly. Customer requirements called for implementing an in-house report designer to create report definitions. Inspired by the VS.NET 2005 Report Designer, we decided to implement part of the tool as a WYSIWYG designer using the design-time infrastructure (IDesignerHost) in .NET 2.0. If you don’t know what I am talking about, read Dinesh Chandnani’s excellent Perfect Host article to learn more about the .NET designer host support.


To spice up the user experience, we decided to use the Infragistics Windows Forms suite. At runtime, the end user could drag Infragistics UltraTextBox and UltraImage controls and drop them on the design canvas. Everything was working just fine during development (a.k.a. on my machine). However, once the application was deployed to QA, the WYSIWYG designer failed abysmally. After some digging, we realized that the Infragistics controls were performing the same license check as they do when dropped on a Windows Form in VS.NET 2005. Since the Infragistics controls failed to find a design-time license, they threw an exception as soon as the user attempted to site the control, and there wasn’t any workaround. We had no choice but to yank out the Infragistics controls and replace them with the plain-vanilla Windows controls – TextBox and PictureBox.


The moral of this story is to avoid using third-party controls when implementing .NET custom designers to prevent licensing gotchas.

Microsoft’s Performance Management Gambit

An interesting article on TDWI about PerformancePoint Server.

PerformancePoint Server Announced

As a follow-up to the previous post, during the webcast Microsoft announced a new product called PerformancePoint Server, due to be released in mid-2007 alongside Office 2007. PerformancePoint will be a convergent product of Scorecard Manager, ProClarity, and Biz# (yes, Microsoft finally let the cat out of the bag about the so-far strictly confidential Biz# initiative). The product will be designed as a web-based portal and will offer performance management, planning, and forecasting capabilities.


I am personally very excited about this new product offering. PerformancePoint will definitely help Microsoft compete better with other heavyweight BI players, such as Cognos. More information about PerformancePoint can be found on its website.

Microsoft’s Business Intelligence Product Roadmap Webcast

Jeff Raikes, President of Microsoft’s Business Division, and Lewis Levin, Corporate Vice President of Microsoft’s Office Business Applications Group, will present a Live Meeting webcast at 11 am (PDT) regarding Microsoft’s Business Intelligence product roadmap. Click here to register.