AzureML – Current State of Affairs

Organizations are showing increased interest in predictive analytics. A large retailer that is using Azure Machine Learning (AzureML) has approached me to help them automate the process. I can’t help but compare AzureML with Analysis Services Data Mining each time I look at it. It seems that Microsoft and I agree at this point: what used to be difficult has been made simple, and vice versa. On the upside, AzureML provides business users with a cloud environment for creating predictive solutions without involvement from IT, and it has more features than SSAS Data Mining. Now that the focus is on the cloud, we probably won’t see future investments from Microsoft in SSAS Data Mining (and the Excel data mining add-in, for that matter), which is currently limited to nine algorithms. By contrast, AzureML not only has many more algorithms, but it also lets users work with R and Python, create workflows, and tap into a vibrant community that is eager to contribute samples. In short, AzureML is to predictive analytics what Power BI is to descriptive analytics. It’s all about democratizing analytics for anyone who needs it.

On the downside, operationalizing AzureML predictive experiments leaves room for improvement. True, it’s very easy to promote your experiment as a web service that can be called in a singleton (one row at a time) or batch manner. The Excel Azure Machine Learning Add-in makes it easy to call the service in either mode if your data lives in Excel and user interaction is acceptable. The Chapter 10 source code of my Applied Power BI book demonstrates how a Power BI model can call a predictive service as a singleton, and the steps are explained in the book. However, automating this process is much more difficult. Previously, all that was required to periodically retrain an SSAS data mining model was to schedule a job with the SQL Server Agent. By contrast, simply scheduling an AzureML experiment for retraining is currently not an option; you need to integrate with the experiment’s web service either programmatically or by using Azure Data Factory. Similarly, scoring the model in batch mode (instead of singleton execution) requires calling the batch web service endpoint, which means either writing custom code (see the Microsoft Azure Machine Learning Samples on CodePlex) or building an Azure Data Factory pipeline. Either scenario requires intervention from your fellow programmer or Azure Data Factory expert. By contrast, SSAS data mining models can be queried with DMX, which is easy, although promoting them to the outside world requires custom coding.

To demonstrate scoring an AzureML model in batch execution mode using data from a stored procedure residing in an on-premises SQL Server, I implemented an Azure Data Factory pipeline. In this case, my AzureML predictive experiment returns the probability that a customer will purchase a product, although what the experiment does is irrelevant as far as automation is concerned.

The high-level steps to automate the process are:

  1. Create a NewCustomers dataset (table) that represents the output of the stored procedure for the new customers that you’ll predict on. Because in this case we’re sourcing data from an on-premises server, we have to install (you guessed it) a Data Management Gateway, which you can download from your Azure Data Factory service in the Azure portal. Mind you, as of now this gateway is not the same as the Power BI Gateway, which is also used by PowerApps, Microsoft Flow, and Azure Logic Apps.
  2. Because the predictive batch service requires an input, use the Copy activity (blessed be the ADF team for adding it!) to save the results of the stored procedure to Azure Storage as a CSV file with headers.
  3. Use the AzureMLBatchExecution activity to pass the file to the batch endpoint of the predictive experiment.
  4. Save the predicted results from the predictive service as a file in Azure Storage and do whatever is needed with the results. (A sketch of the pipeline definition follows this list.)
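
To make this more concrete, here is a minimal sketch of what such a pipeline definition might look like in ADF’s JSON authoring format. The dataset, activity, linked service, and stored procedure names (NewCustomersIn, NewCustomersBlob, PredictedBuyersBlob, BikeBuyersWebService, uspGetNewCustomers) are hypothetical placeholders, and the policy and scheduler sections are omitted for brevity (the pipeline later in this post shows what they look like):

{
  "name": "PredictBuyersPipeline",
  "properties": {
    "description": "Score new customers with the AzureML batch endpoint",
    "activities": [
      {
        "type": "Copy",
        "name": "CopyNewCustomersToBlob",
        "typeProperties": {
          "source": {
            "type": "SqlSource",
            "sqlReaderStoredProcedureName": "uspGetNewCustomers"
          },
          "sink": {
            "type": "BlobSink"
          }
        },
        "inputs": [ { "name": "NewCustomersIn" } ],
        "outputs": [ { "name": "NewCustomersBlob" } ]
      },
      {
        "type": "AzureMLBatchExecution",
        "name": "ScoreNewCustomers",
        "typeProperties": {
          "webServiceInput": "NewCustomersBlob",
          "webServiceOutputs": {
            "output1": "PredictedBuyersBlob"
          }
        },
        "inputs": [ { "name": "NewCustomersBlob" } ],
        "outputs": [ { "name": "PredictedBuyersBlob" } ],
        "linkedServiceName": "BikeBuyersWebService"
      }
    ],
    "start": "2016-09-10T00:00:00Z",
    "end": "2016-09-14T00:00:00Z"
  }
}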

If the experiment gets its input data from the Import Data module (formerly known as Reader), such as from an on-premises or Azure SQL table, you can simplify scoring and retraining the model by creating a simple Azure Data Factory pipeline that has only the AzureMLBatchExecution activity. In this case, the activity passes no input to the web service, and if the experiment writes to a database, no output is needed either. However, because the pipeline definition requires an output, I still had to create a dummy Azure Storage dataset (PredictedBuyersSQL), although no files will be sent to it.


Here is what the simplified ADF pipeline may look like when the experiment reads from and writes to a database.

{
  "name": "PredictBuyersPipelineSQL",
  "properties": {
    "description": "Azure ML model with sql azure reader/writer",
    "activities": [
      {
        "type": "AzureMLBatchExecution",
        "typeProperties": {},
        "outputs": [
          {
            "name": "PredictedBuyersSQL"
          }
        ],
        "policy": {
          "timeout": "00:10:00",
          "concurrency": 1,
          "executionPriorityOrder": "NewestFirst",
          "retry": 1
        },
        "scheduler": {
          "frequency": "Day",
          "interval": 1
        },
        "name": "MLSqlReaderSqlWriterActivity",
        "description": "test",
        "linkedServiceName": "BikeBuyersWebServiceSQL"
      }
    ],
    "start": "2016-09-10T00:00:00Z",
    "end": "2016-09-14T00:00:00Z",
    "isPaused": true,
    "hubName": "adfprologikaonpremdf_hub",
    "pipelineMode": "Scheduled"
  }
}
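
For completeness, the BikeBuyersWebServiceSQL linked service that the activity references is what points ADF to the experiment’s batch endpoint. Its definition would look roughly like this, with the endpoint URL and API key being placeholders that you obtain from the experiment’s web service dashboard:

{
  "name": "BikeBuyersWebServiceSQL",
  "properties": {
    "type": "AzureML",
    "typeProperties": {
      "mlEndpoint": "https://ussouthcentral.services.azureml.net/workspaces/<workspace-id>/services/<service-id>/jobs",
      "apiKey": "<api-key>"
    }
  }
}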

AzureML will play an increasingly important role with predictive analytics on the rise. I hope the next release of AzureML will simplify operationalizing predictive experiments so that it doesn’t require programming effort or involvement from IT. Now that AzureML supports importing data from on-premises SQL Server and other cloud data sources, it should be simple to automate retraining and scoring models.

Power BI Governance Getting Better

Larger organizations are naturally interested in established procedures for data governance, retention, search, and taxonomy. With the rising importance of data analytics, such policies are equally important for BI artifacts. For example, a large organization may want to restrict users to accessing Power BI only from approved devices and/or on premises. Although it doesn’t yet match the SharePoint data governance features, Power BI is making strides in this direction. The tenant admin can access the Admin Portal page (log in to powerbi.com, then click Settings, Admin Portal) to:

  1. View Power BI usage metrics and utilization. This fulfills a similar role to the Power Pivot Management Dashboard in SharePoint.
  2. Manage users.
  3. Set global tenant permissions, such as the ability to publish content packs to the entire organization, share content with external users, publish to web (anonymous access), export data, interact with R scripts, use Cortana and Analyze in Excel, work with templates, create audit logs, and classify data.

Recently, Power BI added two very important governance features:

  • Active Directory conditional access policies (requires Azure Active Directory Premium) – These enforce multi-factor authentication, which adds an additional level to the login process: besides entering an email and password, the user has to enter a code that Power BI sends to a mobile device. The “Block access when not at work” rule prevents the user from accessing Power BI while not at work.
  • Auditing – When enabled, this feature generates audit logs when users access Power BI content, export data, or make changes to important settings. Although the tenant admin needs to access the Office 365 Security and Compliance Portal to view the logs, auditing doesn’t require an Office 365 subscription.

Prologika Newsletter Fall 2016

Embedding Reports in Custom Applications


Last week’s seminar on formulating a Power BI enterprise strategy, held at the Microsoft office, was a great success. Over 50 people witnessed the amazing capabilities of the Power BI platform. As Power BI evolves, we’ll have similar events to bring you up to date. If you couldn’t attend, you can find the slides here. Or are you looking for more in-depth Power BI training? There is still time to register for my 2-day public Power BI workshop on Sep 14-15 at the Microsoft Office in Atlanta. Reserve your seat today to attend this exclusive event for only $999 and learn practical Power BI knowledge and data analytics skills that you can immediately apply to your job.


Speaking of Power BI, do you need to embed reports in custom apps in order to bring data analytics to your customers? If so, look no further than Power BI Embedded, which Prologika has been using to help ISVs increase the value of their apps by bringing instant insights to their customers. Most applications need some reporting capabilities. Report-enabling custom applications has traditionally been challenging with the Microsoft BI platform. True, Visual Studio includes Windows Forms and ASP.NET ReportViewer controls that make it very easy to embed Reporting Services reports (also known as paginated reports). However, chances are that you’d prefer more interactive reports that can be viewed with any browser and on any device. This is where Power BI Embedded comes in. You have the app. You have the data. Now bring data to life inside your app with Power BI Embedded!

Power BI Embedded

What’s Power BI Embedded?

Power BI Embedded is an Azure cloud service that helps developers embed Power BI interactive reports in custom apps for third parties. Notice that I said “third parties”. The current licensing model prevents you from using Power BI Embedded to distribute reports inside your organization or to re-implement functionality that already exists in Power BI. Fair enough – internal users are already covered by Power BI licenses, and Microsoft doesn’t want you to come up with a tool that competes with Power BI. Speaking of licensing, what’s great about the Power BI Embedded pricing model is the recent change Microsoft made where you’re charged per report session and not for the number of reports rendered or customers registered! When the user opens a report, a session is started for one hour or until the user closes the app. The first 100 sessions/mo are on Microsoft. After that, you’re charged $5 per 100 sessions per month. Besides Power BI Embedded’s great features, this cost-effective pricing model is why independent software vendors (ISVs) and developers are flocking to Power BI Embedded.

How are users provisioned?

If you recall, Power BI requires each user to sign up. Power BI Pro charges the user $9.99 per month (based on feedback from customers, they don’t pay the sticker price because they either acquire Power BI through the Office 365 E5 plan or have a discounted price). If you have an app that can potentially be accessed by thousands of users, the Power BI pricing model is not cost effective. By contrast, Power BI Embedded leaves it up to the custom app to authenticate the user, so there is no provisioning you have to do. However, although Power BI Embedded uses the Power BI infrastructure, don’t expect the Power BI Embedded reports to show up when you log in to Power BI. In other words, Power BI Embedded and Power BI don’t share datasets and reports. Consequently, at least for now, Power BI Embedded is limited to embedding reports only: no Q&A, no quick insights, no portal, and no other Power BI goodies. Another limitation is that, at least for now, Power BI Embedded supports report viewing only; users can’t edit the reports, such as to add or remove fields.

What data sources are supported?

One way to acquire data is to import it in Power BI Desktop. When you import data, you’re limited to 1 GB of compressed data, which is the same dataset limitation that Power BI currently has. Because of the excellent compression (5-10x ratio), you can still pack a lot of data into a 1 GB dataset. For example, you can create a separate extract for each customer if you have a limited number of customers or if you need to provide a high degree of customization for each customer. Another option is to connect live to cloud data sources. Because Power BI Embedded is an Azure cloud service, it naturally works with Azure-resident PaaS data sources, such as Azure SQL Database and Azure SQL Data Warehouse.

How does security work?

Power BI Embedded uses OAuth to authorize your users with Power BI. Your application authenticates the user as it normally would. If the user has access to reports, your application logs in to Power BI Embedded using a special access key (think of it as a password). When the user requests the report, the application generates a token that Power BI validates to grant access to the report. If you need to support row-level security (RLS), Power BI Embedded has you covered too! You can define your security roles and row filters in Power BI Desktop. Then, your application can pass the user login (it doesn’t have to be on your domain) and the role(s) the user belongs to. The net result is that the user can see only the data he or she is authorized to see.
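
To illustrate, the token your application generates is a JSON Web Token signed with your access key. Here is a hedged sketch of what its decoded claims might look like; the identifiers are placeholders, and the exact claim set may vary by SDK version:

{
  "ver": "0.2.0",
  "typ": "embed",
  "iss": "PowerBISDK",
  "aud": "https://analysis.windows.net/powerbi/api",
  "wcn": "MyWorkspaceCollection",
  "wid": "<workspace-id>",
  "rid": "<report-id>",
  "username": "customer1@example.com",
  "roles": "SalesRep",
  "nbf": 1472688000,
  "exp": 1472691600
}

Because the token is signed with the access key, Power BI Embedded knows the request comes from your application and trusts the username and roles you pass for RLS.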

How customizable is Power BI Embedded?

The Power BI Embedded customization story got even better with the introduction of the JavaScript APIs. They allow you to programmatically access the report object model on the client, such as to react to page change events, pass filters, or hide the report filter pane. For example, an application can implement its own page navigation and filter pane that replaces the Microsoft-provided filter pane. This gives the application more control over validating and passing visual-, page-, or report-level filters.
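
To give you an idea, a basic filter passed through the JavaScript API is just a JSON object. Here is a sketch, with made-up table and column names:

{
  "$schema": "http://powerbi.com/product/schema#basic",
  "target": {
    "table": "Customer",
    "column": "Region"
  },
  "operator": "In",
  "values": [ "East", "West" ]
}

The application can validate the user’s selections and then apply such filters programmatically, so the Microsoft-provided filter pane isn’t needed at all.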

How do I get started?

Microsoft has provided an excellent ASP MVC sample to get you started. Check also my “Configuring Power BI Embedded” blog post for a better configuration experience.

MS BI Events in Atlanta

As you’d probably agree, the BI landscape is fast-moving and can be overwhelming. If you need any help with planning and implementing your next-generation BI solution, don’t hesitate to contact me. As a Microsoft Gold Partner and premier BI firm, Prologika can help you plan and implement your data analytics projects, and you can rest assured that you’ll get the best service.

Regards,


Teo Lachev
Prologika, LLC | Making Sense of Data
Microsoft Partner | Gold Data Analytics

Upcoming Power BI Events in Atlanta

Time is running out to sign up for these exclusive Power BI Events brought to you by Prologika and Microsoft!

FORMULATING POWER BI ENTERPRISE STRATEGY
Date: Aug 31st, 8:30-12:00
Place: Microsoft Office in Alpharetta
Cost: Free
To learn more and register: http://bit.ly/powerbiseminar
2-DAY APPLIED POWER BI TRAINING CLASS
Date: Sep 14-15, 8:30-5:00
Place: Microsoft Office in Alpharetta
Cost: $999/person
(use coupon POWERBI20160914 to get 10% discount when signing up two or more people)
To learn more and register: http://bit.ly/powerbiworkshop

“Enter Data” Feature in Power BI Desktop

Power BI Desktop has a very useful feature that lets you create tables by manually entering data. This could be useful in a variety of scenarios, such as entering reference data, defining KPI goals, creating simple lookup tables, or prototyping some data. If you’re familiar with creating tables in Power Pivot by copying and pasting tabular data, think of Enter Data as the Power BI Desktop equivalent. However, Enter Data is more flexible because it also lets you edit the data! This makes it more similar to the Power Pivot linked tables that automatically synchronize changes in the Excel source tables.

Creating a new table is straightforward: you click the Enter Data button in the Home ribbon. Don’t confuse this with the New Table button in the Modeling ribbon, which allows you to create a read-only table from a table-producing DAX expression. While entering the initial data and columns is easy, finding out how to make changes is not that obvious (behind the scenes, Enter Data embeds the data in the query itself, so changes are made through the Query Editor). To do so:

  1. Click the Edit Queries button in the Home ribbon to open the Query Editor.
  2. In the Queries pane, select the query that corresponds to the “Enter Data” table.
  3. In the Applied Steps pane, click the gear icon next to the Source step.


New USERPRINCIPALNAME DAX Function

Now that Power BI Desktop supports row-level security (RLS), modelers have a little predicament. On the desktop, the USERNAME() function returns the user’s domain login (domain\login). However, when the model is deployed to powerbi.com, USERNAME() returns the user principal name, which typically (but not always, depending on how your AD is set up) is the user’s email address. To simplify dynamic security based on the user identity, DAX introduces a new USERPRINCIPALNAME() function that can help you secure on a column that stores the user principal name. This avoids having to use an OR filter to support both deployment scenarios.
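
For example, assuming a hypothetical Users table with a Login column (domain\login) and an Email column (user principal name), the role filter simplifies from an OR condition to a single comparison:

-- Before: the filter had to cover both identity formats
Users[Login] = USERNAME() || Users[Email] = USERNAME()
-- Now: one comparison works both on the desktop and in powerbi.com
Users[Email] = USERPRINCIPALNAME()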

Notice that if your computer is not joined to a domain, both USERNAME() and USERPRINCIPALNAME() return the same thing (domain\login).

Configuring Power BI Embedded

There is a lot of interest surrounding Power BI Embedded from organizations looking for cost-effective ways to embed customer-facing (external) reports in custom apps. And there will be even more interest now that Microsoft has reduced the Power BI Embedded cost dramatically by switching from a per-render to a session-based pricing model. The best way to get started with Power BI Embedded is with the Microsoft sample app, which is well documented here. Here are a few notes for a better configuration experience:

  1. Register a native Azure app even if your custom app is web-based. That’s because the ProvisionSample console app (inside the sample solution) expects to be configured in Azure as a native app.
  2. Instead of using the Azure Portal, the easiest way to register the app is to go to http://dev.powerbi.com/apps. On the registration page, specify the app name (it should typically correspond to your web app name, although you can enter anything), and then enter https://login.live.com/oauth20_desktop.srf as a redirect URL because it’s hardcoded in the ProvisionSample app. Once you register the app, the registration page will give you the client id, which you need to enter in the App.config file of the ProvisionSample app, together with the subscription id, azureresourcegroup, and access key (the most important piece) of your Power BI Embedded service. You can obtain the subscription id, access key, and azureresourcegroup settings from the Power BI Service page as explained here.


  3. Apparently, something has changed in Azure, because trying to run the ProvisionSample app gave me the error “AADSTS65005: The client application has requested access to resource ‘https://management.core.windows.net/’. This request has failed because the client has not specified this resource in its requiredResourceAccess list”. To fix this, follow these steps:
    1. Go to the Azure Portal (http://portal.azure.com), click More Services, then Active Directory.
    2. Click the Applications tab and then change the Show drop-down to “Applications my company owns” and then click the checkmark next to it.
    3. Click the application you just registered (PowerBIEmbeddedDemo in my case). In the application page, click Configure.
    4. Click the “Add application” button and add the “Windows Azure Service Management API” application. Expand the Delegated Permissions drop-down and check the only permission there. You need to delegate the necessary permissions to the Power BI Service.


Now you should be able to run the ProvisionSample console app successfully. If you still have issues, verify the configuration settings in the App.config file.

2-Day Applied Power BI Workshop – Atlanta

Are you looking for an agile self-service platform that doesn’t require reporting and query skills to get basic analytics done without reliance on IT? Or perhaps you’ve heard of or are evaluating Power BI but are not sure where to start or how to get the most out of it? If so, this workshop is for you. A year ago Microsoft unveiled the new Power BI platform, consisting of the PowerBI.com cloud service, Power BI Desktop, and Power BI Mobile. Since then, Prologika has helped organizations of all sizes adopt Power BI. Power BI supports a dizzying variety of features and integration scenarios, and it offers plenty to all types of users interested in data analytics: information workers, data analysts, BI pros, and developers.

Reserve your seat today to attend this insightful 2-day workshop for only $999 (use coupon POWERBI20160914 to get a 10% discount when signing up two or more people) at the Microsoft Office in Alpharetta, when Teo Lachev (CEO of Prologika, a Power BI Red Carpet Partner) teaches you practical Power BI knowledge and data analytics skills that you can immediately apply to your job. See how Power BI can improve your productivity even further.

Syllabus>>

Key Benefits

  • Understand how Power BI changed the way users (information workers, data analysts, BI pros, and developers) gain and share data insights.
  • Learn how to connect to popular cloud services to derive instant insights, create interactive reports and dashboards, and view them in the browser and on the go.
  • Discover how to integrate and transform data from virtually everywhere and then implement sophisticated self-service models and business calculations.
  • Find how to implement hybrid architectures and meet strict security requirements by leaving data on premises and deploying reports and dashboards to the cloud.
  • Learn how to share your BI artifacts and collaborate with other teammates.
  • Gain practical skills by creating a self-service model in the lab exercises.
  • Learn Power BI best practices, limitations (every tool has them) and workarounds.
  • Get your questions answered.
  • and much more…

You won’t want to miss this educational and engaging training event! Attend it and get a free paper copy of the book Applied Microsoft Power BI! Please register today as seating is limited.

Audience

  • Information workers
  • Business analysts
  • BI professionals
  • In general, anyone interested in self-service data analytics with Power BI

Prerequisites

Students are encouraged to bring their laptops for the exercises. Detailed setup instructions and source files will be sent before the event.

Instructor

Teo Lachev is an internationally recognized authority on data analytics and CEO of Prologika. Teo helps organizations make sense of their most valuable asset: their data. His strategy formulation, trusted advisory and mentoring, and design and implementation services empower his clients to apply data analytics effectively in order to understand, improve, and transform their business. Teo has authored and coauthored several books; his latest one is “Applied Microsoft Power BI (Bring your data to life!)”. He has been leading the Atlanta Microsoft Business Intelligence group since he founded it in 2010. Microsoft has recognized Teo’s expertise and contributions to the technical community by awarding him the prestigious Microsoft Most Valuable Professional (MVP) award for Data Platform since 2004.

Formulating a Power BI Enterprise Strategy Seminar – Atlanta


A year ago Microsoft unveiled the new Power BI platform, consisting of the PowerBI.com cloud service, Power BI Desktop, and Power BI Mobile. Since then, Prologika has helped organizations of all sizes plan for and adopt Power BI. Power BI supports a dizzying variety of features and integration scenarios, but it might be difficult to understand how it fits in your data analytics ecosystem.

Join Prologika and Microsoft for a free 3-hour seminar on Wednesday, August 31st, 8:30 AM-12 PM ET, at the Microsoft Office in Alpharetta, when Teo Lachev (CEO of Prologika) and Brian Jackson (Cloud Solution Architect at Microsoft) share practical knowledge and experience to help you formulate a Power BI enterprise strategy. If you’re considering Power BI but are not sure how it fits within your organizational data strategy, this event is for you.

Key Takeaways

  • Understand how Power BI changed the way users (information workers, data analysts, BI pros, and developers) gain and share data insights.
  • Learn 10 areas where Power BI excels compared to other popular BI tools, such as Tableau, Qlik Sense, Sisense, Domo, and others.
  • Plan a data access strategy for:
    • Importing data
    • Connecting live to cloud sources with content packs and solution templates
    • Connecting live to on-premises data sources
    • Implementing real-time dashboards
    • Embedded reporting
    • Taking a deep dive to learn how Prologika designed and implemented a hybrid architecture solution for a Fortune 50 organization and met security requirements that prevented exporting data to the cloud
  • Join in a discussion about other business use cases and gaps between Power BI and other BI products, and find how to address them. Get your questions answered.

You won’t want to miss this educational and engaging event! Please register today as seating is limited: https://prologika.com/event/formulating-a-power-bi-enterprise-strategy/

Agenda

8:30-9:00 – Networking and introductions

9:00-10:30 – How Power BI empowers businesses like yours

10:30-10:40 – Break

10:40-11:45 – Plan a data access strategy and go “under the hood” of a hybrid architecture case study

11:45-12:00 – Q&A

Presenters

Teo Lachev is an internationally recognized authority on data analytics and CEO of Prologika. Teo helps organizations make sense of their most valuable asset: their data. His strategy formulation, trusted advisory and mentoring, and design and implementation services empower his clients to apply data analytics effectively in order to understand, improve, and transform their business. Teo has authored and coauthored several books; his latest one is “Applied Microsoft Power BI (Bring your data to life!)”. Prologika is a Microsoft Gold Partner in Data Analytics, demonstrating a “best-in-class” ability and commitment to meet Microsoft customers’ evolving needs and distinguishing itself within the top one percent of Microsoft’s partner ecosystem. Learn more at www.prologika.com.

Brian Jackson is a Microsoft Certified Architect and Cloud Solution Architect at Microsoft. He has deep technical expertise in the SQL Server and Azure platforms and served as a subject matter expert for Microsoft’s SQL Server Master and Business Intelligence certification programs. Brian has more than 20 years of experience in solution architecture and software development with a focus on business intelligence, data warehousing, and database design. He has strong customer relationship skills, with over 15 years of IT consulting for several Fortune 500 companies and proven success in leading and delivering large-scale implementations with globally distributed teams.

Going with the Flow

Currently in preview, Microsoft Flow is a cloud service for creating automation flows without writing code, similar to Zapier’s “zaps” and IFTTT’s “recipes”. How is this useful for BI? Let’s consider an example. Power BI recently introduced data-driven alerts in the Power BI Service (previously, alerts were supported on iPhone only). Currently, alerts can be created only on single-card and gauge dashboard tiles that are connected to imported datasets. You can go to the tile properties and click the Manage Alerts (bell) icon to create an alert rule, such as “SalesAmount is above 1,000,000.”


When the alert rule condition is met, you are notified in the Power BI notification center and by e-mail. But what if you want to broadcast the alert to a larger audience? Currently, this is not a native Power BI feature. Sure, you can forward the email manually, but what if you’re on vacation, or the alert is triggered outside working hours and you need to notify certain people immediately? Enter Microsoft Flow, which allows you to create simple if-then-else flows. For example, I created a flow with a trigger that checks my Office 365 email account for emails sent from noreply@powerbi.com whose subject contains “Alert”. If this condition is met, the “yes” action forwards the email to additional people.


Microsoft Flow has a comprehensive list of triggers and actions for integration with many popular services, such as Dynamics CRM, Salesforce, SQL Server, MailChimp, SharePoint, Facebook, GitHub, and many more. Naturally, Microsoft Flow integrates very well with Microsoft cloud and on-premises services. It also has an extensible architecture that allows developers to plug in additional services. Together with PowerApps, Microsoft Flow has a bright future to help you automate your business processes and get actionable insights.