SQL Server 2008 R2 Released to Manufacturing

Robert Bruckner announced that SQL Server 2008 R2 was released to manufacturing today. Robert summarized the most important new features. I covered them in my blog as well. I'll update this blog once I find out when R2 will be available on MSDN.

UPDATE

According to this Microsoft blog, R2 will be available for MSDN subscribers on May 3rd.

Using XPerf to Test Analysis Services

I have to admit that I am a performance buff. In a previous blog, I mentioned that the XPerf utility can help you gain additional insight into Analysis Services performance. Akshai Mirchandani from the SSAS team was kind enough to share his performance testing experience and recommended this utility. It turns out he was right – it is a real gem! Just after writing that blog, an opportunity presented itself. A business user complained that drilling down a member in an Excel PivotTable report took about 20 seconds on a virtual test server. This inspired me to take XPerf for a spin. In this case, the database was relatively small, but the user had requested two parent-child dimensions on the report with some 3,500 and 300 members respectively. As you probably know, parent-child dimensions are slow by nature. To make things worse, both dimensions had many-to-many relationships. SQL Profiler showed that almost the entire time was spent reading data from partitions. However, I was interested to find out how much time was spent in actual I/O vs. aggregating data so I could make specific performance recommendations. The findings were interesting. Here are the steps I followed to test the performance using XPerf on the virtual server running Windows Server 2008:

  1. I downloaded the Microsoft Windows SDK for Windows 7. Although it's for Windows 7, it runs on the Windows Server OS as well. When I ran the web install, I selected only the Win32 Development Tools option to reduce the size of the download.
  2. Once the SDK was installed, I ran Install Windows Performance Tool Kit (x64), found in the Microsoft Windows SDK v7.0 -> Tools program group, and accepted the default options.
  3. I ran the command prompt with elevated privileges (Run As Administrator) and started a trace with the default configuration.
    C:\Program Files\Debugging Tools for Windows (x64)>xperf -on DiagEasy
  4. In Excel, I drilled down to expand a member of one of the parent-child dimensions that was particularly slow.
  5. Once Excel was done executing the query, I stopped the trace and saved the results in a file mytrace.etl.
    C:\Program Files\Debugging Tools for Windows (x64)>xperf -d mytrace.etl
  6. I double-clicked the mytrace.etl file, which was saved in the C:\Program Files\Microsoft Windows Performance Toolkit folder, to open it in the Windows Performance Analyzer. This is where the fun starts!


  7. Since I was interested in disk utilization and CPU usage, I clicked the flyout glyph and checked only these options so that only two graphs were shown.
  8. On each graph, I expanded the Processes panel and left only msmdsrv.exe checked to filter the trace results to the Analysis Services server only.

    At this point, I couldn't believe my eyes! As you can see, the server spent a fraction of a second reading from disk. The remaining time was spent in number crunching to aggregate the results. To see the actual metric, I selected the I/O peak, right-clicked, and clicked Summary Table. This showed that the server spent 29ms to read only 118,784 bytes from disk, which makes sense (as I said, the database is small). In summary, this particular case would benefit much more from additional CPU power and potentially more CPUs, since the storage engine is highly parallel.


    As you can see, XPerf can help you gain important performance details that are not available in SQL Profiler. In this blog, I only scratched the surface of what XPerf can do for you. One very interesting feature is stack walking, which gives you a breakdown of the time spent per function call, assuming you have the debug symbols loaded. The stack walking feature is explained in more detail in this Windows Performance Toolkit – Xperf blog. Of course, it would be nice if SQL Profiler gave you more detailed tracing information so you don't have to rely on XPerf, e.g. Reading Data From Disk Begin/End, Data Cache Population Begin/End, Storage Engine Work Begin/End, Formula Engine Work Begin/End. So, vote here.
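
    If you want to experiment with the stack walking feature yourself, the commands below are only a minimal sketch rather than a recipe. The Latency kernel group and the Profile stack-walk flag are just one common combination, and you need to point _NT_SYMBOL_PATH to a symbol store so the Windows Performance Analyzer can resolve the call stacks:

    C:\Program Files\Debugging Tools for Windows (x64)>set _NT_SYMBOL_PATH=srv*C:\symbols*http://msdl.microsoft.com/download/symbols
    C:\Program Files\Debugging Tools for Windows (x64)>xperf -on Latency -stackwalk Profile
    ... reproduce the slow query in Excel ...
    C:\Program Files\Debugging Tools for Windows (x64)>xperf -d mytrace_stacks.etl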

Speaking at Atlanta.MDF

I will be speaking at the Atlanta Microsoft Database Forum group (Atlanta.MDF) on May 10th in Atlanta, GA. The topic will be What's New in Reporting Services 2008 R2. Given that SQL Server 2008 R2 (aka SQL Server 2010) will be released in May, the timing is good. This is your chance to learn about and see the cool new features in action, ask tough questions, and literally put me on the spot. Save the date in your calendar.

Date: 5/10/2010

Time: 6:30 PM – 9:00 PM

Place: Microsoft, 1125 Sanctuary Pkwy., Suite 300, Alpharetta, GA 30004

Analysis Services I/O Time

A couple of years ago I was working on a KPI dashboard project where we had to display a bunch of KPIs produced by counting (distinct count) the number of customers, accounts, etc. Even with SSAS 2008 we couldn't do much to optimize the queries to render the page in a few seconds because we found that 50% of the execution time was spent in the storage engine. We flirted with the idea of using solid state disks (SSD), but back then they were not that popular.

Recent feedback from other MVPs and members of the SSAS team suggests that in general SSDs are favorable for optimizing SSAS I/O performance. As you would imagine, the more time the Storage Engine (SE) spends reading data from disk, the more performance improvement you can expect by switching to faster disks. But then the question becomes how much time SSAS really spends reading data from disk. As it turns out, SQL Profiler doesn't give you the answer. That's because the Started/Finished reading data from the partition events also include the time spent aggregating data, and this time may be significant, especially in the case of parent-child or many-to-many dimensions. So, judging from SQL Profiler alone, you may think that most of the query time is spent in I/O, where faster disks may help, while it may turn out that you actually need more CPU power.

Since as it stands SQL Server Profiler doesn't break down the SE time, you need to rely on the operating system to find out how much time was spent in actual I/O. Event Tracing for Windows (ETW) can give you this information. For example, on Windows Server 2008 and Windows 7 you can use the XPerf tool from the Windows Performance Toolkit. The following resources should get you started with XPerf:

Windows Performance Toolkit – Xperf

Using Xperf to take a Trace

Two Minute Drill: Introduction to XPerf

As you will find out, the information emitted by XPerf can be overwhelming when all you want is to answer a simple question. I posted feedback on Connect asking that a future release of SSAS include more detailed tracing of the SE inner workings. This information should be available inside the SE, so it shouldn't be that difficult to break down the Progress Report Begin/End events so customers know whether they should scale up in terms of CPU or I/O. Please vote for it if you find it useful.

http://connect.microsoft.com/SQLServer/feedback/details/550431/provide-more-detailed-strorage-engine-trace
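
Until something like that ships, a quick-and-dirty alternative to a full ETW trace is to sample the standard process-level performance counters while the query runs. The following C# console snippet is only a sketch; it assumes a default Analysis Services instance whose process instance name is msmdsrv (a named instance may show up as msmdsrv#1) and simply prints the I/O read rate and CPU utilization once a second:

using System;
using System.Diagnostics;
using System.Threading;

class SsasCounterSampler
{
    static void Main()
    {
        // Process-level counters for the Analysis Services process ("msmdsrv" is an assumption)
        PerformanceCounter ioReads = new PerformanceCounter("Process", "IO Read Bytes/sec", "msmdsrv");
        PerformanceCounter cpu = new PerformanceCounter("Process", "% Processor Time", "msmdsrv");

        // The first NextValue() call on a rate counter only establishes a baseline
        ioReads.NextValue();
        cpu.NextValue();

        // Sample for 30 seconds while the slow query is running
        for (int i = 0; i < 30; i++)
        {
            Thread.Sleep(1000);

            // Note: IO Read Bytes/sec counts all I/O the process issues (file, network, device),
            // and % Processor Time can exceed 100 on a multi-core box
            Console.WriteLine("I/O read: {0:N0} bytes/sec, CPU: {1:N0}%", ioReads.NextValue(), cpu.NextValue());
        }
    }
}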

Feeding Reports by Code

In a previous blog, I introduced one of the new SSRS 2008 R2 features: reports as data feeds. As I said, the R2 release will include a new Atom Data Feed renderer that produces an Atom service document (*.atomsvc), which defines a feed for each data region. I also said that the main scenario for report feeds is to let the PowerPivot add-in for Excel (previously known as Gemini) consume report data. Recently, I watched Pablo Castro's interesting “ADO.NET Data Services: What's new with the RESTful data services framework” presentation, which inspired me to find a way to consume a report data feed programmatically outside PowerPivot. Since the Atom feed format is pre-defined, a custom application could take advantage of this scenario to retrieve and manipulate report data. In this case, an Atom feed may be preferable to other formats, such as CSV, because data is exposed as properties instead of columns.

So, I embarked on a journey to write a small Silverlight app, RSFeedReader, that reads and displays a report feed. You will need Visual Studio 2010 to open the RSFeedReader source code. Before running the application, you may need to deploy the clientaccesspolicy.xml file included in the source to the IIS root, such as c:\inetpub\wwwroot or C:\inetpub\wwwroot\wss\VirtualDirectories\80 if you have SharePoint installed on port 80, to grant the application rights to connect to Reporting Services. If you don't, Silverlight will treat the application URL (which runs under the local ASP.NET server) and the Report Server URL as two different domains and will throw a security exception.

You can enter a report feed URL in the textbox and hit Fetch. The data feed results are shown in the grid. Similar to PowerPivot, each row represents a row in the report data region, but the properties are stacked vertically instead of horizontally. The first column shows the entry identifier, which identifies the row in the data region, and the second column shows the values in that row. The Generating Data Feeds from Reports BOL topic covers the technical details of report feeds pretty well. Basically, the feed URL must include a unique identifier, which you obtain from the Atom service document when you export the report to the Atom Data Feed format. The default feed URL points to the matrix region in the Company Sales report, although you can't tell this from the feed URL. Again, you need to get the DataFeed identifier for the region from the service document after exporting to Atom. If you request the report data feed in the browser, you will see the resulting XML. The most interesting information is included in the m:properties element.
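
For illustration, a report data feed URL might look something like the hypothetical example below. The xAx0x0 identifier is made up, and the report path assumes the report sits in the root folder; the real values come from the *.atomsvc service document:

http://localhost/ReportServerR2?/Company Sales&rs:Command=Render&rs:Format=ATOM&rc:DataFeed=xAx0x0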

Unlike SharePoint, Reporting Services doesn't provide a WCF endpoint, such as ListData.svc, so you can't generate a Visual Studio proxy (unless you do it manually). Instead, RSFeedReader uses WebClient to submit the URL request to Reporting Services and obtain the feed.

protected void LoadFeed(string uri)
{
    WebClient wc = new WebClient();
    wc.OpenReadCompleted += new OpenReadCompletedEventHandler(wc_OpenReadCompleted);
    Uri feedUri = new Uri(uri, UriKind.Absolute);
    wc.OpenReadAsync(feedUri);
}

private void wc_OpenReadCompleted(object sender, OpenReadCompletedEventArgs e)
{
    if (e.Error != null)
    {
        txtFeedLoc.Text = "Error loading report data feed. " + e.Error.Message;
        return;
    }

    using (Stream s = e.Result)
    {
        SyndicationFeed feed;

        using (XmlReader reader = XmlReader.Create(s))
        {
            // read the feed
            feed = SyndicationFeed.Load(reader);
            var posts = from item in feed.Items
                        select new FeedEntry(item.Id, (XmlSyndicationContent)item.Content);

            // show the feed entries in the grid
            dGrid.ItemsSource = posts;
            dGrid.Visibility = Visibility.Visible;
        }
    }
}

 

Once the asynchronous call completes, RSFeedReader loops through the feed entry elements and populates a collection of FeedEntry objects, which I bind to the grid. The FeedEntry class parses the content element of the entry (where the properties are located) and exposes them as a ContentProperties object (not shown), which I bind to the nested grid in the LoadingRow event of the outer grid.
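
The post doesn't show the FeedEntry implementation, so here is a minimal sketch of what it might look like. Instead of a separate ContentProperties type, this simplified version just collects the children of the m:properties element as name/value pairs:

using System.Collections.Generic;
using System.ServiceModel.Syndication;
using System.Xml.Linq;

public class FeedEntry
{
    // The ADO.NET Data Services metadata namespace used by the Atom data feed
    private static readonly XNamespace m = "http://schemas.microsoft.com/ado/2007/08/dataservices/metadata";

    public FeedEntry(string id, XmlSyndicationContent content)
    {
        Id = id;
        Properties = new List<KeyValuePair<string, string>>();

        // Load the <content> element and pick up the children of <m:properties>
        XElement contentXml = XElement.Load(content.GetReaderAtContent());
        XElement properties = contentXml.Element(m + "properties");
        if (properties != null)
        {
            foreach (XElement property in properties.Elements())
            {
                Properties.Add(new KeyValuePair<string, string>(property.Name.LocalName, property.Value));
            }
        }
    }

    public string Id { get; private set; }
    public List<KeyValuePair<string, string>> Properties { get; private set; }
}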

It probably won't be long before various custom applications start popping up that let end users subscribe to report feeds, for example to get notified when report content changes.

Remembering Report Parameters

Scenario: You want to let end users view a report in Report Manager or SharePoint and “remember” the report parameter selection. You find that you cannot just tell the users to add the report URL to the browser favorites because the parameters are not exposed on the URL. This is by design and cannot be changed. Further, you discover that for some reason end users prefer to play Solitaire instead of fiddling with the URL syntax to construct the report URL manually.

Solution: The Perimeter Persistence report demonstrates an implementation approach that Reporting Services 2008 makes possible. The user clicks the Add Favorite hyperlink, which prompts the user to add a favorite. When the user clicks the Add button, the report URL, including the parameter selection, is added to the browser favorites.


Implementation: I followed these steps to implement the Add Favorite hyperlink that injects some JavaScript into the hosting page to get the job done:

1.  Add a textbox that uses the following expression:

= Iif (NOT Globals!RenderFormat.IsInteractive, Nothing,
"<a href=""javascript:addFavorite('" & Globals!ReportServerUrl & "?" & Globals!ReportFolder
& "/" & Globals!ReportName
& "&ProductCategory=" & Parameters!ProductCategory.Value
& "&ProductSubcategory=" & Join(Parameters!ProductSubcategory.Value, "&ProductSubcategory=")
& "','" & Globals!ReportName & "')"">Add Favorite</a>")

The expression calls a JavaScript function named addFavorite, which you will find in the script.js file. The expression passes a valid report URL, such as:

http://localhost/ReportServerR2?/TechEd/Perimeter Persistence&ProductCategory=1&ProductSubcategory=2

Unfortunately, you cannot make the link "generic" because as it stands Reporting Services doesn't let you loop through the Parameters collection in an expression. If you use SQL Server 2008 R2, you can use the new Globals!RenderFormat.IsInteractive built-in field to check the requested export format and disable the link if the renderer doesn't support interactive features. Use the Join function if you have a multivalue parameter, as the expression above does for ProductSubcategory.

2.  Select the entire expression in the textbox, right-click it, choose Placeholder Properties, and check the HTML – Interpret HTML Tags as Styles option to format the resulting text as a hyperlink. Isn't the SSRS 2008 rich formatting nice?

3.  Copy the content of script.js and append it to the end of the source of the following pages:

\Reporting Services\ReportManager\Pages\Report.aspx
\Reporting Services\ReportServer\Pages\ReportViewer.aspx

Why do you need to add the script to two pages? Because when viewing the report in Report Manager, the hosting page is Report.aspx, but when the user clicks the saved link, the URL points directly to the report server and the ReportViewer.aspx page is used. If you prefer the link to point back to Report Manager, change the expression to request the Report.aspx page. If you target SharePoint, the URL should point to the RSReportViewer.aspx page using the syntax explained in this blog by Prash Shirolkar.

Product Release Galore

Microsoft announced release dates for SQL Server 2008 R2, SharePoint 2010, Office 2010, and Visual Studio 2010.

  • According to the Microsoft SQL Server Team Blog: SQL Server 2008 R2 (aka SQL 2010) will be listed on Microsoft's May price list and will be available by May 2010.
  • According to the Microsoft SharePoint Team Blog: SharePoint 2010 and Office 2010 will be officially launched on May 12, 2010, with an RTM date of April 2010.
  • Visual Studio 2010's release date was pushed to April 12, 2010 due to some performance issues, according to Developer Division Marketing and Communications Manager Rob Caron's blog.

2010 sure is a busy year for BI!

Chasing Parameters

Scenario: You use the Visual Studio ASP.NET ReportViewer and you can't figure out how to get the parameter values when the page posts back. I couldn't find a straightforward answer on the discussion list, so I thought my findings might prove useful.

Solution: Depending on your scenario, you can choose one of the following three approaches:

1. You can get the current parameter values after the ReportViewer.PreRender method completes. Use another event that fires after PreRender. Based on my testing, the only events I found to work are ReportViewer.Unload and Page.Unload, e.g.:

protected void reportViewer_Unload(object sender, EventArgs e) {
    // ReportParameterInfoCollection lives in the Microsoft.Reporting.WebForms namespace
    ReportParameterInfoCollection parameters = reportViewer.ServerReport.GetParameters();
}

2. Subclass the control and override OnPreRender, calling the base method and then obtaining the parameter values, as the sketch after this list shows.

3. If you are using Visual Studio 2010, the new ReportViewer exposes a new SubmittingParameterValues event for this purpose.
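
Here is a minimal sketch of the second approach. It assumes the WebForms ReportViewer in your version can be subclassed, and the class and event names are made up for illustration:

using System;
using Microsoft.Reporting.WebForms;

// Hypothetical wrapper control; register it on the page in place of the stock ReportViewer
public class ParameterAwareReportViewer : ReportViewer
{
    // Raised after the viewer has resolved the current parameter selection
    public event EventHandler ParametersResolved;

    protected override void OnPreRender(EventArgs e)
    {
        base.OnPreRender(e);

        // At this point GetParameters() reflects the user's latest selection
        ReportParameterInfoCollection parameters = this.ServerReport.GetParameters();

        // Inspect or persist the values here, or hand them off to the hosting page
        if (ParametersResolved != null)
        {
            ParametersResolved(this, EventArgs.Empty);
        }
    }
}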

BIDS Integration Story in R2 – The Good, the Bad, and the Ugly

There was a huge discussion thread about the BIDS-Visual Studio integration story in the SQL Server 2008 timeframe, where customers complained that BIDS got married to Visual Studio. This required switching both BI and code projects at the same time if you wanted a solution with both project types. What has changed in R2?

The good – Report Server projects in BIDS R2 can target either SQL Server 2008 or R2, as explained in this blog. There is no change to Analysis Services projects. If you have a Report Server project with RDL 2008 files, the BIDS Report Designer won't upgrade the report definitions. However, the moment you add an R2-specific feature, such as the new map region, BIDS will upgrade the report definition to R2 RDL. Therefore, if you want to play with R2 on your machine while the rest of the team is still on 2008, and you are careful, you can install R2 and upgrade BIDS without affecting the rest of the team.

The bad – The R2 Report Server project format has changed to accommodate the new project deployment settings, which are explained in the above-mentioned blog. The first time you open a Report Server project in BIDS R2, you will be asked to upgrade the project (not the RDLs). But what if you are forging new frontiers in your company (aka not following corporate standards) and decide to move to R2 while your peers are still on SQL 2008, assuming you use source control and share project files? No worries. BIDS 2008 will automatically strip out the elements in the report project file that it doesn't recognize. But when your peers save the project, the project file will revert to the 2008 format, so you'll get prompted to upgrade again the next time you load the project in BIDS R2. To solve this problem, I use a local project file.

The ugly – R2 doesn't integrate with Visual Studio 2010; it targets VS 2008 only. And since the BI project types are part of BIDS and not Visual Studio, VS 2010 users won't get the BI project templates either. In other words, you must run BIDS R2 and Visual Studio 2010 side by side. Consequently, you won't be able to have a solution that spans both Visual Studio 2010 code projects and BI projects. This is like having one TV to watch US channels and a second TV to watch European channels.

It would be nice if one day the SQL Server setup program detected the Visual Studio version installed and asked which version you want to integrate with. But given that Visual Studio project formats change between releases, this may remain wishful thinking.

Where is RDLOM in R2?

After covering the cool new stuff in R2, it's time to talk about the uncool. One R2 change that really vexes me concerns RDLOM. RDLOM, short for Report Definition Language Object Model, lets developers program RDL in an object-oriented way as opposed to manipulating XML. RDLOM can save you a lot of effort when you have to meet more advanced requirements that involve pre-processing RDL. For example, in one of my projects we had to implement a wizard that would walk business users through steps to generate an ad hoc report. Behind the scenes, the wizard would generate the report definition by using a custom RDLOM, which we developed since back then Microsoft didn't have their own. I included a scaled-down version of our implementation in my TechEd 2007 code sample. Although it took us a while to get our RDLOM implemented, it turned out to be invaluable down the road. Since we didn't have to use an XML API to manipulate RDL, programming RDL was much simpler. For another project, we used RDLOM to let the customer specify which sections they wanted to see on the report. Once the selection was made, we would remove the unwanted sections from a report template. The list of real-life scenarios that could take advantage of RDLOM goes on…

Apparently, I wasn't the only one bugging Microsoft to ship an RDLOM with SSRS, and at TechEd 2007 USA they promised that this would happen. Sure enough, SSRS 2008 brought us an unsupported version, which I blogged about and covered more extensively in my Applied SSRS 2008 book. Although unsupported, the SSRS 2008 RDLOM was very useful. It was implemented in a single assembly (Microsoft.ReportingServices.RdlObjectModel.dll), exposed all RDL elements as objects, and supported serializing and deserializing RDL.

Alas, this has come to an end with R2. After upgrading to R2, you'll find that the classes in Microsoft.ReportingServices.RdlObjectModel are marked as internal, so you cannot use this assembly anymore. Instead, RDLOM has moved into the Microsoft.ReportingServices.Designer.Controls.dll assembly. The good news is that there are rumors that RDLOM may get documented. Now the bad news:

  1. Clearly, the R2 RDLOM was not meant to be redistributed. While you need to explicitly reference only the Microsoft.ReportingServices.Designer.Controls and Microsoft.ReportingServices.RichText assemblies, you'll find that they depend on other assemblies, and you end up distributing a total of 25 assemblies, resulting in a whopping 25 MB of referenced code. One way to avoid shipping so many assemblies is to deploy your application in the BIDS or Report Builder 3.0 folders.


  2. R2 RDLOM will be 32-bit only because BIDS and Report Builder are 32-bit only.
  3. R2 RDLOM doesn't support serializing the object model. I found a hack that uses .NET reflection to invoke the internal Serialize method of the old RDLOM 2008.

The RDLOM R2 C# console application bundled in this code sample demonstrates how you can use the "new" RDLOM. Besides referencing the new assemblies, you need to use the Load method to deserialize RDL into the object model:

report = Report.Load(fs);
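
For clarity, fs in the line above is just a stream over the report definition file, e.g. (assuming ReportPath points to the *.rdl file):

using (FileStream fs = new FileStream(ReportPath, FileMode.Open))
{
    report = Report.Load(fs);
}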

And here is the above-mentioned hack to serialize it back to disk or a stream:

using (FileStream os = new FileStream(ReportPath, FileMode.Create)) {
    Assembly assembly = Assembly.GetAssembly(typeof(Report));
    object mc = assembly.CreateInstance("Microsoft.ReportingServices.RdlObjectModel.Serialization.RdlSerializer");
    Type t = mc.GetType();
    MethodInfo mi = t.GetMethod("Serialize", new Type[] { typeof(Stream), typeof(Report) });

    // invoke the internal Serialize(Stream, Report) method via reflection
    mi.Invoke(mc, new object[] { os, report });
}

Other than that, RDLOM remains virtually unchanged from 2008, except that it now supports the R2 enhancements to RDL. What if you don't need this overhead but need to make small changes to RDL that don't necessarily require an object-oriented layer? I've found that the easiest way to do this and avoid XML DOM programming is to use LINQ to XML. The LINQ to XML project included in the same sample demonstrates this approach. For example, the following code changes the data source reference of the first data source:

static void Main(string[] args) {
    XElement report = XElement.Load("report.rdl");
    string dns = "{" + report.GetDefaultNamespace() + "}";
    XElement ds = report.Element(dns + "DataSources").Element(dns + "DataSource").Element(dns + "DataSourceReference");
    ds.Value = "/Data Sources/AdventureWorksAS2008";
    report.Save("report1.rdl");
}
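
In the same spirit, pruning report items is just as easy. The following fragment continues from the code above (it needs a using System.Linq directive for FirstOrDefault) and removes a textbox by name; the txtDisclaimer name is made up for illustration:

// find the first textbox with the hypothetical name "txtDisclaimer" and remove it from the report
XElement unwanted = report.Descendants(dns + "Textbox").FirstOrDefault(t => (string)t.Attribute("Name") == "txtDisclaimer");
if (unwanted != null)
{
    unwanted.Remove();
}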

Not happy about the RDLOM story? Join the movement and vote for my suggestion on connect.microsoft.com to have a supported RDLOM.