Posts

Solving iPad Issues with Power BI Secure Embed

Scenario: You have an intranet web portal and use the Power BI secure embed feature (from the report menu, File -> Embed report -> Website or portal) to embed a report. However, the report doesn’t render on iPad and iPhone devices. Instead, the user is perpetually asked to authenticate with Power BI.

Solution: Apple has started blocking cross-site cookies to tighten up security. To resolve this horrible issue, each user must turn off this feature in the browser that renders the web page. On the iPad or iPhone device, go to Settings > your browser app (such as Safari or Chrome) and set Prevent Cross-Site Tracking to Off.

Report-enable Internal Portals with Power BI Reports

Expanding on my previous blog on this subject, there are three options to report-enable your Intranet portals with Power BI reports:

  • Report link – This is the link to the report that you obtain from the browser address bar. If you want to render the report in full-screen viewing mode (hiding the Power BI chrome), you can append the chromeless=1 query parameter to the URL, such as https://app.powerbi.com/groups/689c8ae3-0e22-44d1-9803-b6afdba4e583/reports/b60eaab4-9a85-47c3-8cde-e3a17e8f3dae/ReportSection?chromeless=1. If the “Display report pages as tabs along the bottom of the report” report setting is disabled, the report will be rendered with a few buttons at the bottom. Clicking “Go back” or exiting the full-screen view “restores” the Power BI portal chrome, allowing the user to gain access to the report action bar, such as to get report insights. This is the only option that gives access to all commands in the report action bar. In addition, users can store the link in the browser favorites and potentially customize it. However, a report link doesn’t allow the report to be “embedded” in a page. Attempting to put the link in an iframe won’t work because of the content security policy (specifically, Microsoft sets the Content-Security-Policy header to frame-ancestors ‘self’ in the response, I guess to prevent organizations from circumventing the Power BI portal).
  • Embed for Website or Portal – This is the iframe code you get from the report’s File -> Embed for Website or Portal. This is your easiest option to provide an embedded experience for your coworkers, but it doesn’t currently support the report action bar. There isn’t a way to customize the report link.
  • Power BI Embedded REST API – This option includes server-side and client-side APIs that let you customize the embedding experience, such as replacing the default Filter pane with your own implementation. This is the option most organizations take to provide an embedded experience for external customers, but it could be used if you want full control over embedding reports internally. Microsoft recently added an option to show the action bar; however, only a subset of commands is available. This is the only option for a single sign-on (SSO) experience, but it’s the most difficult to implement, as it requires extending your app with custom code (see the sketch after this list).
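
To make the last option more concrete, here is a minimal App Owns Data sketch with the Power BI .NET SDK, using the same client calls referenced later in this blog. This is an illustration rather than production code: it assumes you have already acquired an Azure AD access token for the trusted account, and the workspaceId and reportId variables are placeholders.

    // Minimal sketch (assumes the Microsoft.PowerBI.Api NuGet package)
    using Microsoft.PowerBI.Api.V2;
    using Microsoft.PowerBI.Api.V2.Models;
    using Microsoft.Rest;

    var credentials = new TokenCredentials(accessToken); // AAD token for the master account or service principal
    using (var client = new PowerBIClient(new Uri("https://api.powerbi.com"), credentials))
    {
        var report = client.Reports.GetReportInGroup(workspaceId, reportId);
        // "view" grants read-only access; the embed token and embed URL are what the
        // client-side powerbi-client JavaScript library needs to render the report
        var embedToken = client.Reports.GenerateTokenInGroup(workspaceId, reportId,
            new GenerateTokenRequest(accessLevel: "view"));
        // pass embedToken.Token and report.EmbedUrl to your page
    }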

This table summarizes the three options discussed in this blog.

Feature | Report link | Embed for Website or Portal | Power BI Embedded REST API
Embedded content experience | No | Yes | Yes
Single sign-on | No | No | Yes
Action bar | No | No | Partially
Shortcut | Yes | No | No
Developer API | No | No | Yes
Integration effort | Low | Low | Medium to High

Embedding Power BI Reports in Internal Portals

This question pops up over and over. What’s the easiest way to embed a Power BI or paginated report hosted in Power BI Service in an internal portal, such as a custom site developed by your team or an on-prem SharePoint Server, so internal users can see all reports in one place? Use the “Embed report” feature, which will give you a link or iframe code that you can readily add to the portal without any coding (notice that there is a separate SharePoint Online option because SharePoint Online has a special web part for this purpose).

Will you get single sign-on (SSO) so that the interactive user credentials automatically flow to Power BI? Nope. To the best of my knowledge, Power BI doesn’t support single sign-on even if your organization has extended its Active Directory to Azure, but let me know if you have found a workaround. The first time the user opens a Power BI report in the browser session, the user will be prompted to enter credentials. The user won’t be asked to authenticate when viewing subsequent reports in the same browser session because the authentication token will be cached. The browser can also mitigate the issue by saving the user credentials until the password changes.

What if you want to avoid the prompt under any circumstance? Write code and use Power BI Embedded. Down the custom code rabbit hole we go… The upside is that you’ll get more control over the embedded reports thanks to the Power BI JavaScript library, such as if you want to replace the Filter pane with your own filtering mechanism.

Atlanta MS BI and Power BI Group Meeting on October 3rd (Auto-provision embedded report delivery for your customers)

Please join us online for the next Atlanta MS BI and Power BI Group meeting on Monday, October 3rd, at 6:30 PM ET. Tom Huguelet (Lucient) will present a real-world case study and the challenges involved in bringing a Power BI application from an initial Excel-based POC to an ISV platform that auto-provisions for new clients, using Power BI Embedded Azure Premium Capacity. For more details and to sign up, visit our group page.

Presentation: Auto-provision embedded report delivery for your customers
Date: October 3rd
Time: 6:30 – 8:30 PM ET
Place: Click here to join the meeting
Overview: In this session we’ll look at a real-world case study and the challenges involved in bringing a Power BI application from an initial Excel-based POC to an ISV platform that auto-provisions for new clients, using Power BI Embedded Azure Premium Capacity.

We will cover:

  • Initial Business Case and Project Team
  • Testing Embedded in a Sandbox Environment (Authentication)
  • Enabling User-Customized Experiences in Embedded
  • Using REST APIs for Automatic Provisioning
  • Lessons Learned and Challenges Encountered

Speaker: Tom Huguelet has been engaged in using Business Intelligence and Dashboards & Analytics to enable Corporate Performance Management since 1999. While coming from a technological background, Tom has always focused on the people and process aspects of how Business Intelligence can create positive changes within an organization. In his role as a Practice Director, he leads and mentors technical teams and advises business stakeholders on how to build the cross-disciplinary teams necessary to be successful with Business Intelligence. Tom holds several Microsoft certifications including Microsoft Certified Trainer, is a frequent speaker at industry events, has co-authored three books on SQL Server and Data Warehousing, and helped start a BI roundtable group in Chicago.


Power BI Embedded, Service Principals, and AAS

In my previous post “Power BI Embedded, Service Principals, and SSAS“, I discussed how you can integrate Power BI Embedded (App Owns Data) configured for service principal authentication with SSAS to pass the effective user identity. One important observation is that you can use this approach with both internal and external users. For internal users, the Power BI gateway (running under an account that has admin rights to the SSAS instance) passes the effective user identity under the EffectiveUserName connection string setting. The effective user identity maps to the user’s UPN, such as john.doe@prologika.com, so that SSAS can map it to the corresponding Windows account. For external users, you can configure the gateway for CustomData and pass whatever you want as an effective user identity.

Suppose that one day you migrate your model to Azure Analysis Services (AAS). Will it work? Unfortunately, not. Since there is no gateway between Power BI and AAS, there isn’t a layer to authenticate using a trusted account. So, the Power BI team has decided to go only with CustomData, and Power BI Embedded supports a special parameter that works only for AAS. Although the documentation doesn’t emphasize this difference, it has an important paragraph: “The only way to have dynamic RLS (which uses dynamic values for filter evaluation) in Azure Analysis Services, is using the CUSTOMDATA() function”. Let’s break this down.

  1. You must use the Object ID of the service principal account when you construct the effective identity. See my previous blog for how to obtain that identifier. Attempting to pass anything other than the Object ID will result in a Forbidden error when the code attempts to obtain the embed token by calling client.Reports.GenerateTokenInGroup().
    var identity = new EffectiveIdentity("<Object ID GUID>", new List<string> { report.DatasetId }, customData: "someuser@acme.com");
  2. You must use the customData parameter to pass whatever identifier your AAS row-level security will use to authorize the interactive user. DAX can obtain this identifier from the CUSTOMDATA() function, such as in a role filter like Customer[Email] = CUSTOMDATA().
  3. You must add the service principal Object ID to each AAS security role in which the user needs to be evaluated.

Your Power BI Embedded App Owns Data implementation will need different code for SSAS and AAS. The AAS version relies on CUSTOMDATA for handling row-level security.
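
To put the three requirements together, here is a minimal sketch of generating the embed token for an AAS-backed report. This is an illustration under the assumptions above, not official sample code; the Object ID GUID and email are placeholders.

    // Sketch: construct the effective identity with the service principal Object ID
    var identity = new EffectiveIdentity(
        "<service principal Object ID>",          // must be the Object ID, not the Application ID
        new List<string> { report.DatasetId },
        customData: "someuser@acme.com");         // the value CUSTOMDATA() will return to your RLS rules
    var request = new GenerateTokenRequest(accessLevel: "view",
        identities: new List<EffectiveIdentity> { identity });
    // A Forbidden error here usually means the identity is not the Object ID
    var embedToken = client.Reports.GenerateTokenInGroup(WorkspaceId, report.Id, request);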

Although this implementation path is fundamentally different from the SSAS one, it will work with external users who are not part of your Azure AD. Note, however, that internal users registered in Azure AD can’t just be added to AAS roles. That would be pointless because you won’t be able to pass their identity under EffectiveUserName, so AAS won’t be able to evaluate them as AAD users. Both internal and external users must therefore go through CUSTOMDATA.

Power BI Embedded, Service Principals, and SSAS

Power BI Embedded supports two ways for your custom app to authenticate to Power BI using a trusted account: a master account (the original option) and, more recently, a service principal. Service principal authentication is preferred because it doesn’t require storing and using the credentials of a Power BI Pro account. Configuring the service principal and embedding reports with imported data is easy. Not so much with embedding reports connected to on-prem Analysis Services models, mainly because of documentation gaps. Here are some notes you might find useful that I harvested from a recent engagement.

  1. Contrary to what the documentation states (that only SSAS models with RLS require it), you must grant the service principal the ReadOverrideEffectiveIdentity permission. Otherwise, the service principal can’t delegate the user identity to the gateway. So, the gateway admin must call the Gateways – Add Datasource User API.

    TIP: Don’t write code but use the documentation page to call the API (isn’t this nice!). Construct the body to look like this:

    {
      "identifier": "3d9b93c6-7b6d-4801-a491-1738910904fd",
      "datasourceAccessRight": "ReadOverrideEffectiveIdentity",
      "principalType": "App"
    }

What the API page doesn’t tell you is what to use for the identifier in the request body. Your first attempt might be to use the Application ID, but you’ll be greeted with the error “DMTS_PrincipalsAreInvalidError”. Instead, you must use the Object ID (in the Azure portal, go to the app registration, and then click the app name in the “Managed application in local directory” property).

Then, on the next page, copy Object ID.
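
If you prefer code over the documentation page after all, here is a hedged HttpClient sketch of the same REST call. The gatewayId, datasourceId, and adminAccessToken variables are placeholder assumptions; the endpoint is the documented Gateways – Add Datasource User URL.

    // Sketch: POST the request body above to the Add Datasource User endpoint
    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;

    var http = new HttpClient();
    http.DefaultRequestHeaders.Authorization =
        new AuthenticationHeaderValue("Bearer", adminAccessToken); // token of a gateway admin
    var url = $"https://api.powerbi.com/v1.0/myorg/gateways/{gatewayId}/datasources/{datasourceId}/users";
    var body = "{ \"identifier\": \"<service principal Object ID>\", " +
               "\"datasourceAccessRight\": \"ReadOverrideEffectiveIdentity\", " +
               "\"principalType\": \"App\" }";
    var response = await http.PostAsync(url, new StringContent(body, Encoding.UTF8, "application/json"));
    response.EnsureSuccessStatusCode(); // DMTS_PrincipalsAreInvalidError means a wrong identifier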

  2. Once the permission is granted, your custom app must construct an effective identity, such as by using the code below. Notice that you need to get to the report dataset to check how it’s configured. The GetDatasetByIdInGroup API requires the workspace identifier. The code assumes that the dataset is in the report’s workspace. If you need to embed reports connected to shared datasets that reside in other workspaces, you have a problem because the workspace id is not available in the report object. The best way your app can address this might be to maintain a reference map for these reports (reportId, workspaceId). I hope Microsoft changes GetDatasetByIdInGroup to not require a workspace (not sure why it’s needed at all with GUIDs).

    var dataset = client.Datasets.GetDatasetByIdInGroup(WorkspaceId, report.DatasetId);
    var IsEffectiveIdentityRequired = dataset.IsEffectiveIdentityRequired;
    var IsEffectiveIdentityRolesRequired = dataset.IsEffectiveIdentityRolesRequired;
    GenerateTokenRequest generateTokenRequestParameters = null;
    if (!(bool)IsEffectiveIdentityRequired)
    // reports with imported data that don't require RLS
    {
        // generate an embed token for reports without effective identities
        generateTokenRequestParameters = new GenerateTokenRequest(accessLevel: "view");
    }
    else
    // reports connecting to RLS datasets and Analysis Services
    {
        var identity = new EffectiveIdentity("<replace with the interactive user email, e.g. john@acme.com>",
            new List<string> { report.DatasetId });
        var roles = ""; // to evaluate the user as a member of certain RLS roles, replace with a comma-delimited list of roles
        if (!string.IsNullOrWhiteSpace(roles))
        {
            var rolesList = new List<string>();
            rolesList.AddRange(roles.Split(','));
            identity.Roles = rolesList;
        }
        // generate an embed token with effective identities
        generateTokenRequestParameters = new GenerateTokenRequest(accessLevel: "view",
            identities: new List<EffectiveIdentity> { identity });
    }

    Notice that when constructing the EffectiveIdentity, your app must pass a valid Windows login to Analysis Services because behind the scenes the gateway will append EffectiveUserName to the connection string.

Embedding Power BI Reports in JavaScript

A recent engagement brought an interesting dilemma. An ISV wanted to embed Power BI reports in a SharePoint Online portal accessible by their customers. The app handles user authentication and authorization by following the “App owns data” pattern. However, apparently inspired by Stalin’s “No man, no problem” motto, SharePoint has decided to throw away custom server-side code, so we couldn’t add an ASP.NET page to obtain the embed token. This left us with two choices:

  • A pure JavaScript implementation – The main advantage of this option is that we could simply write a SharePoint client web part that handles the entire Power BI Embedded integration in JavaScript only. The major downside is that JavaScript code is not secure, and this approach has a security vulnerability because you must store the service principal credentials in client-side code.
  • Implement a separate app – You can implement a separate service, such as an Azure Function app, to handle obtaining the embed tokens. Eventually, we ended up going down this path because it’s more secure (see the sketch after this list).
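
For reference, here is a minimal sketch of the second option. This is an illustration under assumptions rather than the actual engagement code: the function name, app settings, and reportId query parameter are hypothetical. The point is that the client secret stays in the Function’s application settings, and the client-side web part only ever receives the short-lived embed token.

    // Sketch of a C# HTTP-triggered Azure Function that returns an embed token.
    // Assumes the Microsoft.Identity.Client (MSAL) and Microsoft.PowerBI.Api packages.
    using System;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Http;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;
    using Microsoft.Identity.Client;
    using Microsoft.PowerBI.Api.V2;
    using Microsoft.PowerBI.Api.V2.Models;
    using Microsoft.Rest;

    public static class GetEmbedToken
    {
        [FunctionName("GetEmbedToken")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req)
        {
            // client credentials flow: authenticate as the service principal
            var app = ConfidentialClientApplicationBuilder
                .Create(Environment.GetEnvironmentVariable("ClientId"))
                .WithClientSecret(Environment.GetEnvironmentVariable("ClientSecret")) // never ships to the browser
                .WithAuthority($"https://login.microsoftonline.com/{Environment.GetEnvironmentVariable("TenantId")}")
                .Build();
            var aad = await app
                .AcquireTokenForClient(new[] { "https://analysis.windows.net/powerbi/api/.default" })
                .ExecuteAsync();

            using (var client = new PowerBIClient(new Uri("https://api.powerbi.com"),
                new TokenCredentials(aad.AccessToken)))
            {
                var workspaceId = Environment.GetEnvironmentVariable("WorkspaceId");
                var reportId = req.Query["reportId"].ToString(); // report requested by the web part
                var embedToken = client.Reports.GenerateTokenInGroup(workspaceId, reportId,
                    new GenerateTokenRequest(accessLevel: "view"));
                return new OkObjectResult(embedToken.Token); // only the embed token leaves the server
            }
        }
    }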

I attach the Node.js code if you need a pure JavaScript solution. The code uses a service principal authentication (follow the steps in the “Service principal with Power BI” document to configure it).

Important: If you decide to follow this path, please do your best to obfuscate the client secret because if a hacker gets access to the client-side code, the hacker can obtain that secret.

One could argue that if a hacker gets access to the user’s computer, you have a much bigger issue than going through JavaScript files to get the secret, but there is a security vulnerability, nonetheless.

Editing and Creating Reports in Power BI Embedded

I was doing a Power BI Embedded demo for a customer and lo and behold, being an ever-changing cloud technology, Power BI Embedded surprised me in a great way. When you install and run the Embedded Sample, it adds a nice “Embed and play with current report using Embedded Live Sample” link to the Page Navigation section.

This brings you to the Power BI Embedded live sample with your Power BI Embedded report loaded. You can access the Live Sample from here if you don’t want to configure and install the Power BI Embedded sample. In this case, it uses sample reports. Not only does the sample show you Power BI Embedded in action, but it also shows you the relevant code.

The surprise is that Power BI Embedded now supports Edit and Create modes!

Similar to Power BI Service, users can now edit existing reports and create their own reports from scratch. For more details on how to do this programmatically, read the documentation.


Prologika Newsletter Winter 2016

Designing an Operational Data Store (ODS)


I hope you’re enjoying the holiday season. I wish you all the best in 2017! The subject of this newsletter came from a Planning and Strategy assessment for a large organization. Before I get to it, and speaking of planning, don’t forget to use your Microsoft planning days as they will expire at the end of your fiscal year. This is free money that Microsoft gives you to engage Microsoft Gold partners, such as Prologika, to help you plan your SQL Server and BI initiatives. Learn how the process works here.


Just like a data warehouse, an Operational Data Store (ODS) can mean different things to different people. Do you remember the time when ODS and DW were conflicting methodologies and each claimed to be superior to the other? Since then, the scholars have buried the hatchet and reached a consensus that you need both. I agree.

To me, an ODS is nothing more than a staging database on steroids that sits between the source systems and the DW in the BI architectural stack.

What’s an Operational Data Store?

According to Wikipedia “an operational data store (or “ODS”) is a database designed to integrate data from multiple sources for additional operations on the data…The general purpose of an ODS is to integrate data from disparate source systems in a single structure, using data integration technologies like data virtualization, data federation, or extract, transform, and load. This will allow operational access to the data for operational reporting, master data or reference data management. An ODS is not a replacement or substitute for a data warehouse but in turn could become a source.”

OK, this is a good starting point. See also the “Operational Data Source (ODS) Defined” blog by James Serra. But how do you design an ODS? In general, I’ve seen two implementation patterns, but the design approach you take really depends on how you plan to use the data in the ODS and what downstream systems need that data.

One to One Pull

An ODS is typically implemented as a 1:1 data pull from the source systems, where ETL stages all source tables required for operational reporting and downstream systems, such as loading the data warehouse. ETL typically runs daily, but it could run more often to meet low-latency reporting needs. The ETL process is typically just extract and load (it doesn’t do any transformations), except for keeping a history of changes (more on this in a moment). This results in a highly normalized schema that’s the same as the original source schema. Then, when data is loaded into the DW, it’s denormalized to conform to the star schema. Let’s summarize the pros and cons of the One-to-One Pull design pattern.

 | Pros | Cons
Table schema | Highly normalized and identical to the source system | The number of tables increases
Operational reporting | Users can query the source data as it’s stored in the original source. This offloads reporting from the source systems | No consolidated reporting if multiple source systems process the same information, e.g. multiple systems to process claims
Changes to source schema | Source schema is preserved | Additional ETL is required to transform to star schema
ETL | Extraction and load from source systems (no transformations) | As source systems change, ETL needs to change

Common Format

This design is preferred when the same business data is sourced from multiple source systems, such as when the source systems might change or be replaced over time. For example, an insurance company might have several systems to process claims. Instead of ending up with three sets of tables (one for each source system), the ODS schema is standardized and the feeds from the source systems are loaded into a shared table. For example, a common Claim table stores the claim “feeds” from the three systems. As long as the source endpoint (table, view, or stored procedure) returns the data according to an agreed “contract” for the feed, the ODS is abstracted from source system changes. This design is much less normalized. In fact, for the most part it should mimic the DW schema so that DW tables can piggyback on the ODS tables with no or minimal ETL.

 | Pros | Cons
Table schema | Denormalized and suitable for reporting | The original schema is lost
Operational reporting | Relevant information is consolidated and stored in one table | Schema is denormalized and reports might not reflect how the data is stored in the source systems
Schema changes to source systems | As long as the source endpoints adhere to the contract, the ODS is abstracted from schema changes | A design contract needs to be prepared and source systems need to provide the data in the agreed format
ETL | Less, or even no, ETL to transform data from ODS to DW | ETL needs to follow the contract specification, so upfront design effort is required

Further Recommendations

Regardless of which design pattern you choose, here are some additional recommendations to get the most out of your ODS:

  • Store data at its most atomic level – No aggregations and summaries. Your DW will need the data at its lowest level anyway.
  • Keep all the historical data, or as much as required by your retention policy – This is great for auditing and allows you to reload the DW from the ODS, since it’s unlikely that the source systems will keep historical data.
  • Apply minimum to no ETL transformations in ODS – You want the staged data to keep parity with the source data so that you can apply data quality and auditing checks.
  • Avoid business calculations in ODS – Business calculations, such as YTD, QTD, variances, etc., have no place in ODS. They should be defined in the semantic layer, e.g. an Analysis Services model. If you attempt to do them in ODS, they will surely impact performance, forcing you to pre-aggregate data. The only permissible type of reporting in ODS is operational reporting, such as to produce the same reports as the original systems (without giving users access to the source) or to validate that the DW results match the source systems.
  • Maintain change history on most columns – I kept the best for last. Treat most columns as Type 2 so that you know when a given row was changed in the source. This is great for auditing.

Here is a hypothetical Policy table that keeps Type 2 changes. In this example, the policy rate has changed on 5/8/2010. If you follow this design, you don’t have to maintain Type 2 in your DW (if you follow the Common Format pattern) and you don’t have to pick which columns are Type 2 (all of them are). It might be extreme but it’s good for auditing. Tip: use SQL Server 2016 temporal tables to simplify Type 2 date tracking.

RowStartDate | RowEndDate | SourceID | RowIsCurrent | RowIsDeleted | ETLExecutionID | PolicyKey | PolicyID | PremiumRate
5/2/2010 | 5/8/2010 | 1 | 0 | 0 | 0BB76521-AA63-… | 1 | 49696 | 10.45
5/9/2010 | 12/31/9999 | 1 | 1 | 0 | CD348258-42ED-.. | 2 | 49696 | 10.50

MS BI Events in Atlanta

As you’d probably agree, the BI landscape is fast-moving and it might be overwhelming. If you need any help with planning and implementing your next-generation BI solution, don’t hesitate to contact me. As a Microsoft Gold Partner and premier BI firm, Prologika can help you plan and implement your data analytics projects, and you can rest assured that you’ll get the best service.

Regards,


Teo Lachev
Prologika, LLC | Making Sense of Data
Microsoft Partner | Gold Data Analytics

Automating Power BI Desktop Refresh

Power BI Desktop is becoming an increasingly popular tool for self-service reporting. But it has a glaring gap. Unlike Excel, it doesn’t currently support an object model for automating tasks. Yet, there are a variety of scenarios that call for task automation, such as refreshing imported data. For example, one customer wanted to show a Power BI Desktop dashboard on a shared monitor that refreshes itself periodically. In another scenario, an ISV wanted to automate the data refresh because Power BI Embedded doesn’t currently have APIs to support a scheduled refresh.

Currently, there is no supported way to refresh Power BI Desktop files automatically. However, you can try the following approaches at your own risk:

  1. Use Michal Dúbravčík’s PBIXRefresher script. This is a PowerShell script that opens Power BI Desktop and sends a key press to the Refresh button.
  2. Shell out to open Power BI Desktop with the file you want to refresh (pbidesktop.exe <filepath to pbix file>). Then, programmatically find the port that the local PBI SSAS instance listens on (see my “Upgrading Power BI Desktop Models to Tabular” blog on this subject). Then, use AMO or the new Tabular Object Model (TOM) to send a process script command (see the sketch after this list).
  3. Use the commercial Power Update tool, which is capable of refreshing Excel Power Pivot workbooks and Power BI Desktop files.
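
To illustrate the second option, here is a hedged TOM sketch. The port is a placeholder; you would discover the actual port the embedded SSAS instance listens on at runtime, as described in the referenced blog.

    // Sketch: refresh the model hosted in Power BI Desktop's local SSAS instance.
    // Assumes the Microsoft.AnalysisServices.Tabular (TOM) client library.
    using Microsoft.AnalysisServices.Tabular;

    using (var server = new Server())
    {
        server.Connect("DataSource=localhost:54321"); // replace with the discovered port
        var database = server.Databases[0];           // PBI Desktop hosts a single database
        database.Model.RequestRefresh(RefreshType.Full);
        database.Model.SaveChanges();                 // sends the process (refresh) command
    }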

Downloads

Embed Interactive Reports in Your Apps

Slides from my presentation for Atlanta Code Camp 2016 on Power BI Embedded.

You have the app. You have the data. What if your app could put the power of analytics everywhere decisions are made and allow your customers to gain insights? Modern apps with built-in data visualizations have the power to inform decisions in context—for any user and on any device. Thanks to Power BI Embedded, you can now embed data analytics in any app and on any device.