Automating Power BI Desktop Refresh

Power BI Desktop is becoming an increasingly popular tool for self-service reporting. But it has a glaring gap. Unlike Excel, it doesn’t currently support an object model for automating tasks. Yet, there are a variety of scenarios that call for task automation, such as refreshing imported data. For example, one customer wanted to show a Power BI Desktop dashboard on a shared monitor that refreshes itself periodically. In another scenario, an ISV wanted to automate the data refresh because Power BI Embedded doesn’t currently have APIs to support a scheduled refresh.

Currently, there is no supported way to refresh Power BI Desktop files automatically. However, you can try the following approaches at your own risk:

  1. Use Michal Dúbravčík’s PBIXRefresher script. This is a PowerShell script that opens Power BI Desktop and sends a keystroke to the Refresh button.
  2. Shell out to open Power BI Desktop with the file you want to refresh (pbidesktop.exe <filepath to pbix file>). Then, programmatically find the port that the Power BI Desktop SSAS instance listens on (see my “Upgrading Power BI Desktop Models to Tabular” blog on this subject). Then, use AMO or the new Tabular Object Model to send a process script command.
  3. Use the commercial Power Update tool, which is capable of refreshing Excel Power Pivot workbooks and Power BI Desktop files.
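
To give an idea of what the second approach involves, here is a minimal Python sketch of the port-discovery step. It relies on the fact that Power BI Desktop writes the port of its local SSAS instance to a msmdsrv.port.txt file (UTF-16 encoded) inside a workspace folder under %LOCALAPPDATA%. This folder layout is an undocumented implementation detail that may change between releases, so treat the sketch accordingly:

```python
import glob
import os

def find_pbi_ssas_ports(workspace_root=None):
    """Return the ports of running Power BI Desktop SSAS instances.

    Each open PBIX file gets a workspace folder containing a
    msmdsrv.port.txt file (UTF-16 encoded) with the dynamic port.
    The folder layout is an implementation detail and may change.
    """
    if workspace_root is None:
        workspace_root = os.path.join(
            os.environ.get("LOCALAPPDATA", ""),
            "Microsoft", "Power BI Desktop", "AnalysisServicesWorkspaces")
    ports = []
    pattern = os.path.join(workspace_root, "*", "Data", "msmdsrv.port.txt")
    for port_file in glob.glob(pattern):
        with open(port_file, encoding="utf-16") as f:
            ports.append(int(f.read().strip()))
    return ports

# With a port in hand, connect AMO/TOM (from PowerShell or C#) to
# "localhost:<port>" and send a process command to refresh the model.
```

Once you have the port, the actual refresh is done from AMO/TOM against localhost:&lt;port&gt;, which is outside the scope of this sketch.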

A Glimpse of Embedding Power BI Reports in SSRS

The first public demo of the highly anticipated feature for embedding Power BI reports in SSRS 2016 on premises came at Microsoft Ignite. Scroll to the 58-minute mark in Ricardo Muti’s “Create a modern enterprise reporting and mobile BI solution with SQL Server 2016” video and enjoy! I expect more details at the PASS Summit at the end of this month. Thanks to Dan English for pointing out this video.


Power BI Adds Time Series Forecasting

The September update of Power BI Desktop adds one of the most requested features – time series forecasting on single line charts. You can control the confidence interval and seasonality. To use forecasting, make sure that you add a field of Date data type to the chart axis.


Only when a date field is used will you see the Forecast section added to the Analytics pane.


Presenting at Atlanta Code Camp 2016

I’m presenting Power BI Embedded at Atlanta Code Camp on Saturday, October 15th.

You have the app. You have the data. What if your app could put the power of analytics everywhere decisions are made and allow your customers to gain insights? Modern apps with built-in data visualizations have the power to inform decisions in context—for any user and on any device. Join me to discover how you can embed data analytics in any app and on any device, powered by Azure and Power BI Embedded.

I’ll share my experience in helping customers embed Power BI reports. Learn how to:

  • Create compelling interactive reports
  • Embed easily for faster time to value
  • Deploy quickly and manage with ease
You can find the slides on this site.

MVP for 13 Years!

Microsoft awarded me again with the Most Valuable Professional (MVP) Award for Data Platform. This is a prestigious annual award given to selected individuals in recognition of their expertise and contribution to the community. There are 387 Data Platform MVPs worldwide and 92 of them are in the United States. This makes it 13 consecutive years for me as a Data Platform MVP!


Power BI Nested Hierarchical Labels

In the latest Power BI Desktop update, Power BI introduced a new drill-down option. Previously, after enabling drill-down on a chart and clicking the double-arrow icon, the chart would drill down to the next field in the Axis area (or to the next level of a hierarchy if a hierarchy is added to the Axis area). A student attending my Power BI class asked me this week how to avoid aggregating across all years when the user drills down from the Year to the Month level. Until a few days ago, the only option was to introduce another Month field that has the Year-Month combination, e.g. 2005 July, 2005 August, and so on. Now, with the new drill-down style, the chart nests the hierarchical levels so you don’t have to add a new field. This behavior is similar to how Excel shows categories nested in a PivotChart.

Atlanta MS BI Group Meeting on September 26th

MS BI fans, join me for the next must-attend Atlanta MS BI and Power BI Group meeting on September 26th at 6:30 PM. The Microsoft Ignite conference takes place this year in Atlanta, starting on Sep 26th. Kamal Hathi (General Manager for Power BI at Microsoft) and Pratap Ladhani (Sr. Program Manager at Microsoft) will be in town for Ignite and they will present at our meeting! This is a must-attend session for everyone using or planning to use Power BI. Join us and learn why Power BI enjoys such tremendous success and where Microsoft is planning to take it. Enjoy catering by Subway; thanks to Microsoft for sponsoring this event.

Presentation: Power BI at one year – a look back and a look forward
Level: Beginner
Date: Monday, September 26th, 2016
Time: 6:30 – 8:30 PM ET
Place: South Terraces Building (Auditorium Room)
115 Perimeter Center Place
Atlanta, GA 30346
Overview: This is a must-attend session for everyone using or planning to use Power BI. With weekly and monthly releases, there are a lot of fantastic new features to explore in Power BI. In this session, we will cover an overview of what has been released in Power BI across PowerBI.com, Power BI Desktop, mobile apps, and APIs. You’ll also discover what’s new and exciting in Power BI. We’ll also introduce PowerApps – a new product from the Power BI engineering team that enables non-developers to create & use custom applications & forms across all platforms (iOS, Android, Windows & Web).
Speaker: Kamal Hathi is a General Manager for Power BI at Microsoft and a co-founder of Microsoft’s multi-billion dollar Analytics and Business Intelligence products. Kamal delivered Power BI to market and was responsible for its end-to-end strategy, design, and delivery. Kamal is currently responsible for the product growth and success of the Microsoft Business Application Platform.

Pratap Ladhani is a Sr. Program Manager at Microsoft focusing on customer & partner success for PowerApps and Microsoft Flow. He has been with Microsoft for more than 13 years and has previously spoken at the Build, TechEd, Convergence & WPC conferences.
We expect other speakers from Microsoft to join us too.

Sponsor: Microsoft

New Custom Visual Developer Toolset

One of the most prominent Power BI benefits is its extensible architecture, which allows developers to integrate Power BI with custom apps and extend its capabilities, such as by creating custom visuals. Having contributed one of the first custom visuals, the Sparkline custom visual, I can tell from experience that Microsoft does its part to ensure that submitted visuals meet quality, security, and functionality best practices.

As custom visual developers have probably noticed, the Power BI Dev Tools have been deprecated in favor of the new Custom Visual CLI Developer Toolset. The announcement page enumerates the main benefits of the new toolset. Personally, I like that Microsoft has decoupled the tool from dependencies on the “visual framework” that the original Power BI visuals use. This allows developers to use whatever dev tool they like, such as Visual Studio Code or Visual Studio, to code custom visuals. I also like the better integration with the Power BI Service for auto-detecting code changes and showing what data has been passed to the visual.

There will be some pains when migrating existing visuals to the new tool. As of this writing, there are also missing features, so don’t rush to the new tool yet:

  1. Tooltips are not yet supported.
  2. Value formatters, such as for formatting values shown in tooltips, are not supported, so you have to create your own, use third-party libraries, or copy code from this project.
  3. The D3 type definitions that the tool installs are different from the ones the old tool and the Visual Studio framework used. Moreover, Microsoft indicates that it will stop providing the D3 definitions. Consequently, some D3 features I used resulted in compile errors, so I had to fall back on the old type definitions.
  4. For some reason, checking the “Enable developer visual for testing” checkbox in Power BI Settings didn’t “stick” for me in Chrome, so I had to use Edge.

The new Custom Visual CLI Developer Toolset will improve the developer experience for coding custom visuals. Download the migrated Sparkline code that works with the new toolset to understand the changes.

AzureML – Current State of Affairs

Organizations are showing increased interest in predictive analytics. A large retailer that is using Azure Machine Learning (AzureML) approached me to help them automate the process. I can’t help but compare AzureML with Analysis Services Data Mining every time I look at it. It seems Microsoft and I agree that, at this point, what used to be difficult has been made simple and vice versa. On the upside, AzureML provides business users with a cloud environment for creating predictive solutions without involvement from IT, and it has more features than SSAS Data Mining. Now that the focus is on the cloud, we probably won’t see future investments from Microsoft in SSAS Data Mining (and the Excel data mining add-in, for that matter), which is currently limited to nine algorithms. By contrast, not only does AzureML have many more algorithms, but it also allows users to use R and Python and create workflows, and it enjoys a vibrant community that is eager to contribute samples. So, AzureML is to predictive analytics what Power BI is to descriptive analytics. It’s all about democratizing analytics for anyone who needs it.

On the downside, operationalizing AzureML predictive experiments leaves some room for improvement. True, it’s very easy to promote your experiment to a web service that can be called in a singleton (one row at a time) or batch manner. The Excel Azure Machine Learning Add-in makes it easy to call the service as singleton or batch if your data lives in Excel and user interaction is acceptable. The Chapter 10 source code of my Applied Power BI book demonstrates how a Power BI model can call a predictive service as singleton, and the steps are explained in the book. However, automating this process is much more difficult. Previously, all that was required to periodically retrain an SSAS data mining model was to schedule a job with the SQL Server Agent. By contrast, simply scheduling an AzureML experiment for retraining is currently not an option. You need to integrate with the experiment’s web service either programmatically or by using Azure Data Factory. Similarly, scoring the model in batch mode (instead of singleton execution) requires calling the batch web service endpoint. Consequently, this requires either custom code (see the Microsoft Azure Machine Learning Samples on CodePlex) or an Azure Data Factory pipeline. Either of these scenarios requires an intervention from your fellow programmer or Azure Data Factory expert. By contrast, SSAS data mining models can be queried with DMX, which is easy, but promoting them to the outside world requires custom coding.
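
To give an idea of what the programmatic route involves, here is a minimal Python sketch of submitting a batch execution (BES) job to an AzureML web service. The job URL, the API key, and the blob locations are placeholders, and the exact property names in the request body vary by service version; the service’s API help page generates the authoritative request sample, so treat this as an outline rather than the definitive contract:

```python
import json
import urllib.request

def build_batch_request(input_blob, output_blob, storage_conn):
    """Build a BES request body pointing the job at Azure Storage blobs.

    Property names follow the sample that the web service's API help
    page generates; some services use "Inputs"/"input1" instead.
    """
    return {
        "Input": {
            "ConnectionString": storage_conn,
            "RelativeLocation": input_blob,
        },
        "Outputs": {
            "output1": {
                "ConnectionString": storage_conn,
                "RelativeLocation": output_blob,
            }
        },
        "GlobalParameters": {},
    }

def submit_batch_job(job_url, api_key, body):
    """POST the job to the batch endpoint; returns the service's reply
    (typically a job id), which you then start and poll for completion."""
    req = urllib.request.Request(
        job_url,  # e.g. .../workspaces/<ws>/services/<svc>/jobs?api-version=2.0
        data=json.dumps(body).encode("utf-8"),
        headers={"Authorization": "Bearer " + api_key,
                 "Content-Type": "application/json"},
        method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

This is essentially what the Azure Data Factory AzureMLBatchExecution activity does for you, which is why ADF is the less code-heavy option.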

Here is a diagram of an Azure Data Factory pipeline that I implemented to demonstrate scoring an AzureML model in batch execution mode using data from a stored procedure residing in an on-premises SQL Server. In this case, my AzureML predictive experiment returns the probability that a customer will purchase a product, although what the experiment does is irrelevant as far as automation is concerned.

The high-level steps to automate the process are:

  1. Create a NewCustomers dataset (table) that represents the output of the stored procedure for the new customers that you’ll predict on. Because in this case we’re sourcing data from an on-premises server, we have to install (you guessed it) a Data Management Gateway, which you can download from your Azure Data Factory service in the Azure portal. As of now, this gateway is not the same as the Power BI Gateway that is also used by PowerApps, Microsoft Flow, and Azure Logic Apps, mind you.
  2. Because the predictive batch service requires an input, use the Copy activity (blessed be the ADF team for adding it!) to save the results of the stored procedure to Azure Storage as a CSV file with headers.
  3. Use the AzureMLBatchExecution activity to pass the file to the batch endpoint of the predictive experiment.
  4. Save the predicted results from the predictive service as a file in Azure Storage and do whatever is needed with the results.
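
For reference, the scoring activity in step 3 might be declared along these lines in the ADF (v1) pipeline JSON. The dataset names (NewCustomersBlob, PredictedBuyersBlob) and the linked service name are hypothetical stand-ins for my example; webServiceInput and webServiceOutputs map the ADF datasets to the experiment’s web service input and output:

```json
{
    "type": "AzureMLBatchExecution",
    "typeProperties": {
        "webServiceInput": "NewCustomersBlob",
        "webServiceOutputs": {
            "output1": "PredictedBuyersBlob"
        }
    },
    "inputs": [ { "name": "NewCustomersBlob" } ],
    "outputs": [ { "name": "PredictedBuyersBlob" } ],
    "policy": { "timeout": "00:10:00", "concurrency": 1, "retry": 1 },
    "scheduler": { "frequency": "Day", "interval": 1 },
    "name": "MLBatchScoringActivity",
    "linkedServiceName": "BikeBuyersWebService"
}
```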

If the experiment gets its input data from Import Data (formerly known as a Reader), such as from an on-premises or Azure SQL table, you can simplify scoring and retraining the model by creating a simple Azure Data Factory pipeline that has only the AzureMLBatchExecution activity. In this case, the activity passes no input to the web service, and if the experiment writes to a database, no output is needed either. However, because the pipeline definition requires an output, I still had to create a dummy Azure Storage dataset (PredictedBuyersSQL), although no files will be sent to it.


Here is what the simplified ADF pipeline may look like when the experiment reads from and writes to a database.

{
    "name": "PredictBuyersPipelineSQL",
    "properties": {
        "description": "Azure ML model with sql azure reader/writer",
        "activities": [
            {
                "type": "AzureMLBatchExecution",
                "typeProperties": {},
                "outputs": [
                    { "name": "PredictedBuyersSQL" }
                ],
                "policy": {
                    "timeout": "00:10:00",
                    "concurrency": 1,
                    "executionPriorityOrder": "NewestFirst",
                    "retry": 1
                },
                "scheduler": {
                    "frequency": "Day",
                    "interval": 1
                },
                "name": "MLSqlReaderSqlWriterActivity",
                "description": "test",
                "linkedServiceName": "BikeBuyersWebServiceSQL"
            }
        ],
        "start": "2016-09-10T00:00:00Z",
        "end": "2016-09-14T00:00:00Z",
        "isPaused": true,
        "hubName": "adfprologikaonpremdf_hub",
        "pipelineMode": "Scheduled"
    }
}

AzureML will play an increasingly important role with predictive analytics on the rise. I hope the next release of AzureML will simplify operationalizing predictive experiments so that it doesn’t require programming effort or involvement from IT. Now that AzureML supports importing data from on-premises SQL Server and other cloud data sources, it should be simple to automate retraining and scoring models.

Power BI Governance Getting Better

Larger organizations are naturally interested in established procedures for data governance, retention, search, and taxonomy. With the rising importance of data analytics, such policies are equally important for BI artifacts. For example, a large organization may want to restrict users to accessing Power BI only from approved devices and/or on premises. Although it doesn’t yet match the SharePoint data governance features, Power BI is making strides in this direction. The tenant admin can access the Admin Portal page (log in to powerbi.com, then click Settings, Admin Portal) to:

  1. View Power BI usage metrics and utilization. This fulfills a role similar to the Power Pivot Management Dashboard in SharePoint.
  2. Manage users.
  3. Set global tenant permissions, such as the ability to publish content packs to the entire organization, share content with external users, publish to web (anonymous access), export data, interact with R scripts, use Cortana, Analyze in Excel, and templates, create audit logs, and apply data classification.

Recently, Power BI added two very important governance features:

  • Active Directory conditional access policies (requires Azure Active Directory Premium) – Enforce multi-factor authentication, which adds an additional level to the login process: besides entering their email and password, the user has to enter a code that Power BI sends to a mobile device. The “Block access when not at work” rule prevents the user from accessing Power BI while not at work.
  • Auditing – When enabled, this feature generates audit logs when users access Power BI content, export data, or make changes to important settings. Although the tenant admin needs to access the Office 365 Security and Compliance Portal to view the logs, auditing doesn’t require an Office 365 subscription.