Prologika Newsletter Winter 2019

Happy Holidays! I hope you're enjoying this special time of the year. A few months ago, I did an assessment for a large company that had been advised by an undisclosed source to use its Dynamics Finance and Operations (F&O) system as a data warehouse. Recently, I came across a similar wish, but this time to use SAP as an EDW. Can we do this? Before I give you my opinion, I'm excited to announce the availability of the fifth edition of my "Applied Microsoft Power BI" book – the only Power BI book that is updated every year to keep up with the ever-changing world of Power BI and the Microsoft Data Platform! The book is slowly making its way to retailers and should be available on Amazon in the first days of 2020.

Operational Reporting

I understand that everyone wants to do more with less and shortcuts are tempting. But ERP systems are systems of record, just like any other data source. True, they may own most of the core data that you need for analytics. But that data is normalized and stored in a format that's not conducive to analytics. To make things worse, Dynamics doesn't even give you direct access to the SQL Server database of your production instance. You must go through REST APIs or export the data to gain access to it. And to add new tables, you must create entities in Visual Studio! Still want to build a data warehouse in Dynamics?

ERP systems typically have some reporting features, but they deliver only operational reporting. Operational reporting has a narrow view concerned with "now", such as a report that shows customers with outstanding balances as of today. For example, Dynamics comes with standard SSRS reports. You could also enable analytical workspaces that deliver reports via Power BI Embedded. These reports, however, are operational reports. By contrast, BI is mostly concerned with historical and trend analysis.

BI Axioms

In math, axioms are statements that are assumed to be true without proof. We need BI axioms, and the list can start like this:

  • Every mid to large company shall have a centralized data repository for consolidating trusted data that is optimized for reporting. The necessity for such a repository correlates directly with the number of data sources that must be integrated (that number will increase over time) and the complexity of the data transformations. The centralized data repository is commonly referred to as a data warehouse.
  • Dimensional modeling shall be the methodology to design the data warehouse schema. I know it’s tempting to declare your ODS as a data warehouse, but highly normalized schemas are not suitable for reporting.
  • If you’re after a single version of the truth, you shall have an organizational semantic layer. Find why you need it in my “WHY SEMANTIC LAYER?” newsletter.
  • ERP systems are not a replacement for a data warehouse. Neither are data lakes and Big Data.
  • You shall have both organizational and self-service BI, and they should complement each other. If you lean too much toward organizational BI, you'll get a backlog of requirements. If you lean too much toward self-service BI, you'll end up with fragmented "spreadmarts", which is where you probably started.
  • Most of the BI effort shall go toward organizational BI to integrate data, improve data quality, and centralize business calculations. Tools come and go but this effort shall endure.
  • Agile and managed self-service BI shall fill in the gaps. It should provide a feedback loop to extend organizational BI with data that the entire organization can benefit from.

 


Teo Lachev
Prologika, LLC | Making Sense of Data
Microsoft Gold Partner | Data Analytics


Prologika Newsletter Fall 2019

We all need to share. But until a couple of months ago, a Power BI training or assessment wouldn't be complete unless I got hammered on the Power BI sharing limitations. Fortunately, Microsoft has addressed most of these, and I now have a much better story to tell. And this is the subject of this newsletter.

Quo vadis, Power BI Sharing?

The road to Power BI sharing has been a long and winding one. I covered the gist in my blogs "Power BI Sharing is Getting Better", "Power BI Sharing is Getting Better 2", and "The Power BI Viewer Role". The following table summarizes prior and standing sharing limitations. The Solution column lists the "fix" by Microsoft.

Limitation | Solution
Workspace dependency on Office 365 groups | V2 workspaces don't depend on O365 groups
No IT oversight on creating workspaces | You can now control who can create workspaces in the Power BI Admin center
Group membership limitations | V2 workspaces support all O365 group types
Coarse content access level | Contributor and Viewer roles
No cross-workspace sharing | A dataset can be shared across workspaces
No data governance | A dataset can be promoted and certified
Power BI Premium sharing with viewers requires report sharing or apps | The Viewer role supports sharing with viewers
Only one app supported per workspace | The Viewer role deemphasizes apps
No nesting support (subfolders) | Not addressed yet

Sharing Best Practices

Given the current state of Power BI, I’d like to share some best practices for organizing and sharing content:

  1. Don’t use report and dashboard sharing as it can quickly turn into a maintenance nightmare.
  2. Create workspaces to reflect your organizational functional areas, e.g. Sales, HR, Finance. Unfortunately, workspaces still don’t support subfolders (Microsoft hinted that they are working on such a feature), so for now you must resort to a flattened list.
  3. Instead of individual user assignments, add users to security groups and then add these groups as members to the workspace. This way, when a user leaves the company or moves to another department, you only need to change the user's group membership without making changes to the workspace security policy.
  4. Come up with a data governance policy. The Power BI data governance story is a work in progress, but you can set up some ground rules. For example, once a contributor believes that a dataset is ready for broader consumption, he can promote the dataset. Then, IT can verify the dataset and certify it. Currently, Power BI supports certifying only datasets (reports and dashboards aren't certifiable yet).
  5. This one is hard. Teach data analysts best data modeling practices. Instead of creating a dataset per report, they should create a data model that correctly represents their subject area. A data analyst should create the model once so it can support multiple reports. Once the dataset is published and certified, users can create their own reports.
  6. Although Microsoft is pushing apps very hard and adding some nice features, such as navigation, a standing limitation is that you can create only one app per workspace. So, you can’t publish multiple apps from a workspace, such as to share some reports with one group of users and another set with a different group. Instead, grant access directly to workspaces by using the Contributor and Viewer roles. The Viewer role lifts a significant limitation that forced you to use report/dashboard sharing or apps to share content out of a premium workspace with viewers. If you want to share the entire content of the workspace, you don’t have to use apps. Instead, you can simply add users or groups as viewers to the workspace.

Despite some long-standing limitations, Power BI sharing is coming of age. Follow the above practices and you'll now have a much better way to organize and share content.


Teo Lachev
Prologika, LLC | Making Sense of Data
Microsoft Partner | Gold Data Analytics


Prologika Newsletter Summer 2019

After learning how to model your data properly (the most important skill), DAX will be the next hurdle in your self-service or organizational BI journey. You won't get far in Microsoft BI without DAX. This letter shares a tip on how to identify DAX performance bottlenecks by using a recent Power BI feature called Performance Analyzer.

Learning DAX

Before I get to Performance Analyzer, I'm excited to announce my latest book: Applied DAX with Power BI: From zero to hero with 15-minute lessons. This book was born out of necessity. In my consulting practice, I had been teaching and implementing Power BI and Analysis Services Tabular, and people were constantly asking for DAX book recommendations. Indeed, DAX is not an easy topic and has a way of humbling even experienced practitioners. There are a few good reference books out there, but they can be somewhat overwhelming for novice users. So, I turned my classroom and consulting experience into this book and designed it as a self-paced guide to help you learn DAX one lesson at a time.

This was a different book to write. Unlike my other books, which tend to be heavy, this book is smaller (220 pages). Even so, it took me a while to write it. The book adopts a lesson-based approach, so the reader can learn and practice a specific DAX concept quickly. There are usually three sections in each lesson: Practice (with step-by-step guidance), Outcome (a visual that shows the result), and Analysis (explaining why DAX works this way). Most lessons are five to six pages long, and it should take no more than 15 minutes to complete the lesson's exercises (you can download and view a sample lesson from the book page).

And if you do one lesson per day, you’ll be a DAX expert in a month!
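To give you a flavor of the lesson format, here is a minimal sketch of the kind of measures an early lesson might walk through. The Sales and 'Date' table and column names are illustrative assumptions, not taken from the book:

  -- A base measure that sums line amounts (table and column names are assumed)
  Total Sales = SUMX ( Sales, Sales[Quantity] * Sales[UnitPrice] )

  -- CALCULATE changes the filter context; DATESYTD returns the year-to-date dates,
  -- assuming 'Date' is marked as a date table and is related to Sales
  Sales YTD = CALCULATE ( [Total Sales], DATESYTD ( 'Date'[Date] ) )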

Analyzing DAX Queries

"This report is slow!" I hope you never hear this, but chances are that you will sooner or later. No one likes watching a spinning progress indicator and waiting for a report to show up. Tucked into the View ribbon of recent Power BI Desktop builds is a handy tool called Performance Analyzer that can help you identify slow queries. It's easy to use:

  1. In Power BI Desktop, click the View ribbon and check the Performance Analyzer setting. This will open the Performance Analyzer pane.
  2. Click Start Recording in the Performance Analyzer pane. Once you start recording, any action that requires refreshing a visual, such as filtering or cross-highlighting, will populate the Performance Analyzer pane. You’ll see the statistics of each visual logged in the load order with its corresponding load duration.
  3. You can click the "Refresh visuals" link in Performance Analyzer to refresh all visuals on the page and capture all queries. However, once you are in recording mode, every visual also gets a new icon that lets you refresh only that visual.


Performance Analyzer captures the following information:

  • DAX query – The length of time to execute the query.
  • Visual display – How long it took for the visual to render on the screen after the query is executed.
  • Other – This is the time that the visual spent in other tasks, such as preparing queries, waiting for other visuals to complete, or doing some other background processing.

Although not as comprehensive as DAX Studio, Performance Analyzer is a quick and easy way to get an overall idea of what impacts your report page performance. Performance Analyzer is currently available only in Power BI Desktop, not in Power BI Service.
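If Performance Analyzer shows that most of a visual's duration goes into the DAX query, a common first step is to rewrite the measure so that expensive expressions are evaluated only once. The sketch below is a generic illustration with assumed measure names ([Total Sales] and [Sales LY]), not a prescription for a specific report:

  -- Original measure: [Sales LY] is referenced twice and may be evaluated twice per cell
  Sales Variance % = DIVIDE ( [Total Sales] - [Sales LY], [Sales LY] )

  -- Rewritten with variables: each expression is evaluated once and the result is reused,
  -- which often reduces the "DAX query" time that Performance Analyzer reports
  Sales Variance % Optimized =
  VAR CurrentSales = [Total Sales]
  VAR PriorSales = [Sales LY]
  RETURN DIVIDE ( CurrentSales - PriorSales, PriorSales )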


Teo Lachev
Prologika, LLC | Making Sense of Data
Microsoft Partner | Gold Data Analytics


Prologika Newsletter Spring 2019

At least five of the top 10 IT trends for 2019 identified by Gartner involve machine learning in one form or another. The list includes autonomous things, augmented analytics, AI-driven development, digital twins, and immersive experience. Gartner also predicts that by 2022, at least 40 percent of new application development projects will involve machine learning (ML). Therefore, it's not surprising to see vendors making huge investments in predictive analytics. This newsletter focuses on two important Power BI enhancements for ML that will be available for public preview soon: AutoML and integration with Azure predictive analytics services (Azure ML and Cognitive Services).

AutoML

The problem with machine learning is that it's hard and there aren't that many data scientists out there (or at least ones you can afford). The promise of AutoML is to bring predictive analytics to data analysts just like self-service BI brought descriptive analytics to business users. Power BI AutoML is based on another recent Microsoft technology: ML.NET. ML.NET is for .NET developers who want to incorporate machine learning features in .NET apps without external dependencies or tools.

As it stands, Power BI AutoML requires a premium capacity (either via Power BI Premium or Power BI Embedded). The process starts with the data analyst creating an entity containing the data to be predicted. For example, the entity might return a list of customers who have purchased a product in the past, with some fields that the data analyst believes are significant to "convert" a customer into a buyer, such as demographics fields. Then, the data analyst can use AutoML to create a predictive model that predicts the likelihood that future customers will purchase a product. This could be useful for a targeted campaign or to implement "customers who purchased this product have also purchased another product", but many other ML scenarios are possible. The process for implementing an AutoML model is simple and starts with clicking the AI Insights button on the entity page.


This starts a wizard that walks the data analyst through a few steps that I described in more detail in my “First Look at Power BI AutoML” blog. In the same blog, I shared some ideas to simplify the experience even further that I hope Microsoft will implement.

Integration with Azure ML and Cognitive Services

While AutoML is a "quick and dirty" way for data analysts to implement predictive models, data scientists need more control. They have many tools in the Microsoft ecosystem, ranging from R and Python in SQL Server to cloud-based PaaS services, such as Azure ML. One of the easiest tools is Azure ML. However, currently Power BI doesn't integrate natively with Azure ML experiments, so you must resort to using the Web connector to call the experiment's web service. I demonstrate how you can do this in a singleton way (one row at a time) in chapter 11 of my latest Power BI book (you can download the code from the book page). This method is still available, but it's cumbersome to implement and it's slow.

I describe the new Power BI integration in my blog "First Look at the New Power BI Integration with Azure ML". It offers two important benefits:

  • It simplifies the integration with Azure ML and Cognitive Services – In the case of Azure ML, all you have to do is select the predictive web service and map the source columns as input to the model. Power BI will invoke the web service and add new predictive columns (in this case, whether the customer will be a buyer and the probability of that happening).
  • It's much faster – Power BI calls the service in batch mode, which is significantly faster than predicting one row at a time.

The Cognitive Services integration works in a similar way. It allows you to augment your Power BI data with cognitive features, such as sentiment analysis and image recognition. For more information, watch the related announcements: integration with Cognitive Services and Azure ML, and the new Auto ML features (skip to 1:08:30).

The forthcoming AutoML and integration with Azure ML and Cognitive Services will be major enhancements to the Power BI predictive analytics capabilities. Both features will be available in public preview soon.

 


Teo Lachev
Prologika, LLC | Making Sense of Data
Microsoft Partner | Gold Data Analytics


Prologika Newsletter Winter 2018

Happy Holidays! I hope you're enjoying this special time of the year. According to Gartner, "poor data quality is hitting organizations where it hurts – to the tune of $15 million as the average annual financial cost in 2017". Data integration is another huge challenge; you know it and I know it. Microsoft hopes that Power BI dataflows will help data analysts tackle these issues, and dataflows are the subject of this newsletter. But before I give you my opinion about this newcomer, I'm excited to announce the availability of the fourth edition of my "Applied Microsoft Power BI" book – the only Power BI book that is updated every year to keep up with the ever-changing world of Power BI and the Microsoft Data Platform! The book is slowly making its way to retailers and should be available on Amazon in the first days of 2019.

What’s a dataflow?

I'm sure that by now you've heard about Power Query – a tool integrated with Power BI Desktop that helps the data analyst clean and shape the data. The issue with Power Query, though, is that it's tightly coupled with Power BI Desktop, while data integration challenges sometimes need to be solved at a larger scale. Think of dataflows as "Power Query in the cloud". Dataflows are to self-service BI what ETL is to BI pros.

A dataflow is a collection of Power Query queries, also known as entities, and it lives in a Power BI organizational workspace.

For example, if I want to stage data from Salesforce, I can create a dataflow with as many entities as there are Salesforce tables I want to stage. Unlike Power Query, which can only output to a table in a data model, a dataflow saves its output as a pair of files: a CSV file (data) and a JSON file (metadata) in a Microsoft-provided data lake store.


So, for the BI pros who are familiar with SSIS projects for ETL, think of a dataflow as a project and queries as SSIS packages. Just like you can deploy an SSIS project and schedule it to run at a specific time, you can schedule a dataflow, and this will execute all its queries.

The Good

There is a lot to like about dataflows. I can think of two primary self-service scenarios that can benefit from dataflows:

  • Data staging – Many organizations implement operational data stores (ODS) and staging databases before the data is processed and loaded into a data warehouse. As a business user, you can use dataflows for a similar purpose. For example, one of our clients is a large insurance company that uses Microsoft Dynamics 365 for customer relationship management. Various data analysts create data models from the same CRM data, but they find that refreshing the CRM data is time consuming. Instead, they can create a dataflow to stage some CRM entities before importing them in Power BI Desktop. Even better, they could import the staged CRM data into a single dataset or an organizational semantic model to avoid multiple data copies and duplicated business logic.
  • Certified datasets – One way to improve data quality and promote better self-service BI is to prepare a set of certified common entities, such as Organization, Product, and Vendor. A data steward can be responsible for designing and managing these entities. Once in place, data analysts can import the certified entities in their data models.

There is a solid architecture, vision, and investments behind dataflows. You create dataflows using a tool that you’re already familiar with: Power Query. If you know Power Query, you know dataflows. Microsoft has provided the data lake storage and pricing is included in Power BI Pro/Power BI Premium. So, you have everything you need to get started with dataflows today. Ingesting the dataflow output is very fast too because you import CSV files.

The Bad

My blog on the same subject started with positioning, because watching the Microsoft presentations left me with the impression that Microsoft encourages business users to consider dataflows for any data integration task, from data staging to loading data warehouses and even replacing data warehouses and ETL (heard that vibe before?). But Matthew Roche clarified that "dataflows don't replace data warehouses and ETL any more than Power BI desktop replaces a full set of BI pro tools. Dataflows complement these pro tools, and enable users who are not BI pros to fill in more gaps in a BI solution, similar to other Power BI capabilities. Although dataflows mean that self-service users can do more without help, the same patterns still apply". So, keep this in mind if you hear different propaganda, and use the right tool for the task at hand.

You need Power BI Premium to realize the full potential of dataflows. For some obscure reason, features like linked and computed entities are not available in Power BI Pro. I wonder how many customers will feel pushed to go to Power BI Premium as more and more features are only available there. I don't mind scalability and performance-related features, but incremental refresh and linked entities?

Lastly, the CSV output is both a blessing and a curse. True, ingestion is fast, but text files are not a relational database. Business users can't make direct changes to the staged data. Further, the only option is to import the dataflow output into Power BI, as DirectQuery is not available for text files.

The Ugly

As dataflows stand today (in preview), you (almost) can't directly access the output generated by dataflows. This precludes the continuum between the self-service and organizational worlds. For example, IT might need to import a certified dataset into a data warehouse. But you can only connect to a dataflow in Power BI Desktop because the CSV files are not directly accessible. True, Power BI Premium lets you bring your own data lake storage (currently a preview feature), but a better option in my opinion would have been to provide access to the raw data in both Power BI Pro and Premium, given that this is your data lake and your data.

Prologika Newsletter Fall 2018

Semantics relates to discovering the meaning of the message behind the words. In the context of data and BI, semantics represents the user's perspective of data: how the end user views the data to derive knowledge from it. To address this need, a modeler translates the machine-friendly database structures and terminology into a user-friendly semantic model that describes the business problems to be solved. In my "Why Semantic Layer?" newsletter I explained the advantages of an organizational semantic model. In this newsletter, I'll discuss how Power BI extends semantic modeling and converges it on a single platform. But before I go into details, and speaking of semantic models, I'm excited to announce the availability of my new "Applied DAX with Power BI" workshop and its first in-person, public-enrollment class on Oct 15-16 in Atlanta! Space is limited, so RSVP today.

Semantic Model Flavors

In Microsoft BI, you can implement a semantic model using Power BI Desktop, Excel (Power Pivot) and Analysis Services (Multidimensional and Tabular). The first two are typically used by data analysts, while Analysis Services is considered a professional technology. Thus, we can classify semantic models into two broad categories: personal (self-service) and organizational. Behind the scenes, Power BI Desktop, Power Pivot and Analysis Services Tabular use the same foundation and storage engine. Nevertheless, up to now there have been feature differences and a strict division between these two types.

 | Personal | Organizational
Author | Data analyst | BI pro
Tool | Power BI Desktop, Excel (Power Pivot) | SSDT and Analysis Services
Scope | Narrow (usually to solve a specific need) | Wide (multiple subject areas)
Implementation effort | Short | Longer (data warehouse, ETL, model, testing)
Data capacity | Limited (up to a few million rows) | Larger (millions or billions of rows)
Data quality | Trust author | Trust modeler and testers
Data centralization | May lead to data duplication | Single version of truth
Deployment | Power BI Service, Power BI Report Server | SSAS (on premises), Azure Analysis Services (cloud)
Consumers | Department, project | Potentially entire organization

How Power BI Changes Semantic Modeling

Power BI will blur the boundary between the personal and organizational aspects of semantic modeling. First, we've already seen Microsoft introduce the following "pro" features in Power BI that don't even exist or are more difficult to implement in Analysis Services:

  • Incremental refresh – Delivers the ability to refresh portions of a larger dataset, such as the last 7 days. Currently, the largest dataset size supported by Power BI Premium is 10 GB, but Microsoft has already announced that soon Power BI will support larger datasets. What this means for you is that you’d be able to deploy organizational semantic models to Power BI Premium and schedule them for incremental refresh. My blog “Notes on Power BI Incremental Refresh” provides the details on this feature.
  • Composite semantic models – A composite model has heterogenous storage, such as some tables are imported and some are DirectQuery, as I discussed in my blog “Power BI Composite Models: The Good, The Bad, The Ugly“. This brings a lot of flexibility to how you connect to data.
  • Aggregations – Aggregations are predefined data summaries to speed up queries with very large models. My blog “A First Look at Power BI Aggregations” covers Power BI aggregations.

On the tooling side of things, Power BI Desktop will also pick up "pro" features, such as perspectives and display folders. Microsoft hopes that in time Power BI Desktop will serve the needs of both data analysts and BI pros. However, the lack of extensibility and source control, as well as performance issues caused by committing every model change to the background Analysis Services instance, makes me skeptical that pros will embrace Power BI Desktop. But because Microsoft has announced plans to open the Power BI Tabular management endpoint (recall that published Power BI datasets are hosted in a "hidden" Tabular server), pros will still be able to use SSDT and community tools, such as Tabular Editor, to design and deploy their models to Power BI Premium.

In time, Power BI Premium will become a single cloud platform for hosting Microsoft BI artifacts (semantic models and reports) and facilitating the continuum from personal to organizational BI. This is great news for BI practitioners frustrated by tooling and deployment differences. In the end, the personal and organizational paths will converge without feature discrepancies. The only difference will be the scope of your model and how far you want it to go toward becoming "organizational".

Teo Lachev
Prologika, LLC | Making Sense of Data
Microsoft Partner | Gold Data Analytics


Prologika Newsletter Summer 2018

Microsoft Common Data Services

BI and data integration projects often benefit from an operational data store (ODS), whose benefits and design I discussed in my "Designing an Operational Data Store (ODS)" newsletter. A corporate ODS typically falls into the organizational BI area, which means that it's implemented and sanctioned by IT. Wouldn't it be nice to let Business stage the data needed for business applications and analytics? Of course, it would! Think of Microsoft Common Data Services as a cloud staging database or ODS by Business and for Business. But before I discuss the details, make sure to review our Terms of Use, which has been updated as part of our commitment to transparency and to address the requirements set forth by the new European privacy law (General Data Protection Regulation). By continuing to use the Prologika website and its online services, you consent that you have read, understand, and accept the terms of the Prologika Privacy Policy. If you have any questions regarding our updated Privacy Policy, please contact us by writing to info@prologika.com.

What’s Common Data Services?

Common Data Services were introduced as part of the reimagined Business Application Platform as "one connected platform that empowers everyone to innovate" and to put all the data you need into a standardized data model. Common Data Services consists of two offerings: Common Data Service for Apps (CDS for Apps) and Common Data Service for Analytics (CDS for Analytics).

Why two flavors? Think of the Microsoft Common Data Service for Apps (CDS for Apps) as a cloud OLTP-like repository by Business and for Business. Officially introduced in 2016 and running on Azure SQL Database, CDS for Apps is now the entity and data model behind Dynamics 365. This is where Dynamics 365 stores its data. Because it’s transaction-oriented, it’s layered on top of SQL Server. By contrast, Common Data Service for Analytics (CDS for Analytics) is oriented towards supporting analytical requirements.

How Do They Compare?

The following table compares the two CDS types.

 | CDS for Apps | CDS for Analytics
Primary usage | OLTP | OLAP
Primary tool for loading data | PowerApps/Power Query | Power Query
Primary tool for reading data | PowerApps/Power BI | Power BI
Data storage | Azure SQL Database | Azure Blob Storage (a CSV text file per entity and a JSON file for the schema)
Power BI connectivity mechanism | OData | Azure Blob Storage
Pricing | Included in PowerApps plans | Included in Power BI Pro/Premium
Storage quota | 10 GB per database | Restricted by the associated app workspace quota
Add-ons | Logic and validation | Power BI Insights apps

Both CDS types support standardized entities, whose definitions are documented in the GitHub repository of the Common Data Model. Currently, the schema of these entities is designed and controlled by Microsoft, and it's limited to Dynamics entities, such as Account, Opportunity, and so on. However, Microsoft hopes that other vendors will provide solutions and extend the CDS schema. Of course, because CDS is your database, you can extend it with your own custom entities. Note that both CDS types target business users willing to store and analyze data in a business-friendly staging database. Over time, Microsoft hopes that partners will deliver more value to CDS by implementing apps (CDS apps are like the prepackaged apps that already exist in Power BI, such as for Salesforce and Dynamics). Let's now highlight some of the differences between the two CDS flavors.

Common Data Service for Apps

The main usage scenario for CDS for Apps is to jumpstart the development of PowerApps applications with a standardized data model that you can extend to your own needs.

The Good

There is a lot to like about CDS for Apps. Let's start with pricing. Other vendors, such as Oracle and Teradata, have similar visions and products, but their offerings are very expensive. The CDS for Apps pricing is included in the PowerApps licensing model because PowerApps is the primary client for creating CDS for Apps-centered solutions. Using CDS outside selected Dynamics 365 plans (which include it already) will cost you at least $7 per user per month. CDS for Apps is more than just a data repository. It's a business application platform with a collection of data, business rules, processes, plugins, and more. In this regard, it resembles SQL Server Master Data Services (MDS). The modeler can:

  • Define and change entities, fields, relationships, and constraints. For example, the screenshot shows a custom Device Order entity that I’ve created.
  • Define business rules, such as prepopulating Ship Date based on Order Date.
  • Secure data to ensure that users can see it only if you grant them access. Role-based security allows you to control access to entities for different users within your organization.

Besides the original PowerApps canvas apps (like InfoPath forms), CDS for Apps also opens the possibility to create model-driven PowerApps applications (which require the PowerApps P2 plan). Model-driven apps are somewhat like Access data forms but more versatile. Because PowerApps knows CDS for Apps, you can create the app bottom-up, i.e. start with CDS for Apps and then generate the app based on the actual schema and data. For example, you can use PowerApps to build a model-driven app that implements the workflow for approving a certain process. Model-driven apps are a new style of PowerApps application that makes it easy to build entity forms, entity views, and workflows. How do you get data into CDS for Apps custom entities? Your PowerApps app can write to it. Or, you can create and schedule a project that uses Power Query (yep, the same one as in Power BI) to load data from somewhere into CDS for Apps.

The Bad

How do you get data out of CDS for Apps, such as to import data from some entities into a Power BI model? Microsoft has released a preview build of the Common Data Service for Apps connector for Power BI. However, this connector is even slower than the Dynamics connector. It uses the OData v4 Web API. Based on my limited tests, it took the connector about a minute to download 40,000 rows from Dynamics, clocking in 10% slower than the Dynamics connector. To make things worse, the connector doesn't support query folding, so Power BI must download the entire dataset before Power Query applies filters. The connector also doesn't support REST filter and select predicates, so you can't filter data or select a subset of columns at the source. Microsoft is actively working on improving the connector performance, and it might get better in time.

Continuing down the list of limitations, CDS for Apps doesn't support change tracking (to capture changes to a given row) or incremental loads, such as loading or refreshing only the data that has changed since yesterday or the previous month. These are all essential features that could make an ODS even more valuable.

The Ugly

For years, people have complained that after migrating from on-premises Dynamics to the cloud, they lost the ability to connect to its database directly and had to rely on the REST APIs (slow) or the Data Export Service to export the data to a SQL Server database (fast but requiring additional effort and budget). Unfortunately, although CDS for Apps stores data in Azure SQL Database, Microsoft doesn't expose that database directly so you can get data out fast and bypass the REST endpoint. When I raised this issue with Microsoft, I got feedback that CDS for Apps is a business platform and there are layers on top of the data to handle security, rules, calculations, and so on. However, the argument that CDS for Apps is more than just a database is nonsensical to me. Try to explain to a customer that cakes have layers and CDS for Apps has layers, and therefore getting something out of it is slow. As I mentioned, the "layered nature" of CDS is conceptually like MDS. In fact, I see a lot of overlap. MDS also supports rules, security, and so on, but it doesn't force me to go through the web service interface if all I need is the raw data. Hence, my wish for direct connectivity to the Azure SQL Database endpoint of CDS for Apps.

Common Data Service for Analytics

CDS for Analytics is a standard feature of Power BI, so every Power BI Pro user can access it. CDS for Analytics is exposed to the end user in Power BI as datapools. A datapool is a collection of entities associated with a Power BI app workspace. An entity maps to a text file in Azure Storage. Business users will rely on Power Query to populate (manually or via a scheduled refresh) entities in CDS for Analytics. You can access the workspace datapool in the workspace content page.

The Good

I can think of three primary scenarios where CDS for Analytics can deliver value as it stands today:

  • Offline data staging – Let's say IT doesn't allow direct connectivity to LOB applications, but you need to create some reports on top of this data. You can stage the data as text files into CDS for Analytics. I don't think CDS for Analytics would bring much value when direct connectivity from Power BI Desktop to the source is an option. The more you move the data, the more problems you may run into. At least for now, having apps on top of text files doesn't look like a good reason to me, but I guess we have to see what apps will become available in time.
  • Prepackaged third-party solutions – Some time ago, a software vendor asked me how they can deploy a solution to Power BI for their customers but still retain ownership. Back then I didn't have a good answer, but CDS for Analytics might be a good option now. In fact, besides Power Query as the primary tool for loading entities, any service that can write to Azure Storage can bring data to CDS for Analytics. The ISV can write the entities as CSV files and tell CDS for Analytics to "mount" the storage container. CDS for Analytics can then see these mounted entities and treat them as part of the whole. Worried about protecting intellectual property? Currently, only the Insight App installer would have access to the installed workspace and artifacts (other users in the organization would just see the published reports that are shared with them).
  • Prepackaged insights – Like CDS for Apps, CDS for Analytics understands the Common Data Model. Over time, Microsoft and partners can contribute prepackaged “insights” that are built on top of popular LOB apps, such as Dynamics or Salesforce.

Pricing is also right. CDS for Analytics is included in Power BI, although its storage counts toward the workspace quota. Another thing I like about CDS for Analytics is that the Power BI connector is very fast, unlike the CDS for Apps connector.

The Bad

As of now, datapools support only a small subset of the Power Query connectors. This is probably just a temporary limitation for the preview cycle. I'd imagine that all Power BI connectors for cloud and on-premises data sources will eventually be available. Continuing on the list of limitations, like CDS for Apps, CDS for Analytics doesn't support incremental refreshes, so be careful downloading millions of rows every night.

The Ugly

CDS for Analytics promises to break silos, but a datapool is associated with a Power BI workspace. This architecture fragments CDS for Analytics across Power BI workspaces. However, most users would probably require access to common entities, such as Customer and Product. Not only is this not possible, but the datapool storage is also limited by the workspace quota. So, if you are a Power BI Pro user who has access to an app workspace, you're currently limited to a 10 GB storage quota, which includes not only Power BI datasets but also CDS entities. I wish that CDS had no association with workspaces and was designed as a global staging area, just like Azure Storage. Microsoft has promised at some point in the future to allow you to reference entities between datapools in different workspaces and create calculated entities on top of them.

The success of Common Data Services for Apps will depend largely on adoption and contributions by Microsoft partners. Although it lacks typical ODS features and fast connectivity, CDS for Apps compensates with "business platform" features. CDS for Analytics and Power BI Insights are new additions to Power BI. CDS for Analytics delivers an operational data store (ODS) that is populated and maintained by business users. Microsoft and partners can augment CDS for Analytics with Power BI Insights apps.

Teo Lachev
Prologika, LLC | Making Sense of Data
Microsoft Partner | Gold Data Analytics


Prologika Newsletter Spring 2018

Extending Power BI with PowerApps and Flow

With hundreds of vendors out there, choosing a BI tool isn't easy. If you've been following me for a while, you know that I love Power BI. One of Power BI's most prominent strengths is that it's part of the much broader Microsoft Data Platform that spans various on-premises and cloud offerings for data storage, processing, and analysis. Thanks to this integration, you can implement real-time dashboards, extend your Power BI reports with predictive capabilities, integrate with Big Data, and much more. In this newsletter, I'll talk about an exciting integration option: extending Power BI with business applications, thanks to its integration with PowerApps and Flow. And if you happen to be in the Microsoft office in Alpharetta, feel free to ask for a demo of the Power BI writeback solution that Prologika prepared and deployed to the Microsoft environment. You'll be directed to one of the big screens where you can test the report that I mention in this newsletter.

What’s PowerApps?

Every organization requires some sort of custom data-driven apps. And every organization is under pressure to do more with less. Have you used Microsoft Access? If so, you probably remember that Access makes it easy to create data-driven forms. But those forms were available only in Access and could access only Access databases. You might also remember InfoPath – an app for designing, distributing, filling in, and submitting electronic forms containing structured data, which Microsoft discontinued in 2014. Think of PowerApps as the evolution and replacement of InfoPath. PowerApps is an Azure-hosted service that lets you build data-driven business apps that run in a browser or on mobile devices. Unlike Access and InfoPath, PowerApps supports many cloud and on-premises data sources. Connectivity to on-premises data sources requires installing a data gateway, which is the same software that Power BI uses to tunnel into your corporate network.

Targeting power users and developers, PowerApps requires minimal code and has its own expression language. In the process of building an app, you specify where you want your data to come from. Then, you choose pre-defined layouts for your forms and link them together. You can develop the app online or by downloading a Windows-based desktop app. Developing the app is free. Sharing the app requires a plan, but if you use Office 365, chances are that your O365 business plan already includes PowerApps.

What’s Flow?

As I explained in my “Going with the Flow” blog, Microsoft Flow is another cloud service for creating automation flows without writing code, similar to Zapier’s “zaps” and IFTTT’s “recipes”. Like PowerApps, Flow targets business users and developers willing to create basic multistep flows for automating tasks. For example, you might want to start a workflow when an email is received from specific recipients or containing specific words. Microsoft Flow makes it possible to automate this scenario and many, many more. Business users can connect their accounts, take advantage of pre-built templates, and create their own flows in an easy visual designer. And your PowerApps app can start a flow thanks to the fact that PowerApps and Flow work together.

How Can PowerApps and Flow Enrich BI?

After the integration with Visio, Power BI just took another leap forward with the introduction of the PowerApps custom visual. You can use this visual in your Power BI reports to bridge the data analytics and developer worlds. Now your Power BI apps can integrate with Line of Business (LOB) applications in versatile ways, and the app can trigger workflows. All this with minimum code! Suppose you have a Power BI report that shows sales by customers. As the user browses the data, he realizes that some changes need to be made. Granted, the user can open the appropriate business app, make the changes there, and then go back to the report to see the changes. But read-only reports are so 20th century. Instead, why can't we allow the user to make the changes on the report by integrating it with PowerApps? This scenario is commonly referred to as writeback.

This is exactly the approach my Customer Writeback app demonstrates. The user selects a customer in the table, and the customer details show up in the Change Customer visual to the right. This is the PowerApps visual that references my Customer Writeback PowerApps app. The user can make changes in the PowerApps form and save them back to the underlying database. Then the user can refresh the report to see these changes. Taking this one step further, the app can start a workflow, such as when formal approval of the changes is required.


If you have experience with PowerApps, implementing this scenario takes minutes. You can find the implementation steps and source code in my “Power BI Writeback” blog.

The Power BI integration with PowerApps and Flow opens exciting new possibilities and redefines the meaning of a report. These three technologies let you integrate your reports with "smart" applications that you can implement with almost no code! With more than 100 data sources, PowerApps and Flow can serve as an integration hub to mash up data from almost any place, create workflows, and then embed PowerApps forms in Power BI reports.

Teo Lachev
Prologika, LLC | Making Sense of Data
Microsoft Partner | Gold Data Analytics


Prologika Newsletter Winter 2017

Sharing Reports with External Users

Happy Holidays! I hope you're enjoying this special time of the year – time for sharing and giving. Speaking of sharing, how does your company share BI artifacts with your B2B or B2C partners? Do you still send Excel spreadsheets or embed canned SSRS reports? If so, Power BI offers a better sharing model, and I'll show you how in this letter. And speaking of Power BI, I'm proud to announce that the third edition of my "Applied Microsoft Power BI" book should be out in a week or so. Thoroughly updated to reflect the latest in the ever-changing cloud world of Power BI and featuring 20% new content, this book is the self-study resource your organization needs to master Power BI.

Sharing in Power BI Service

Many organizations share reports with external users for Business to Business (B2B) or Business to Consumer (B2C) scenarios. If all you need is to grant some external users access to some cool interactive reports and dashboards inside powerbi.com, you can do so by just sharing the content out using dashboard sharing or apps, as you do with internal users. There are some special considerations though, so read on.

Like internal users, external users need to be authenticated by a trusted authority. To authenticate external users, Power BI relies on Azure Active Directory (AAD). Therefore, the external organization and user need a record in AAD. If the user doesn't have an AAD account, the user will be prompted to create one. Let's say Elena from Adventure Works wants to grant Matthew from Prologika access to some Power BI content. Elena creates a workspace to host the external content and then invites Matthew. To do so, she can simply share a dashboard or create an app, and then add Matthew's email as a recipient. Matthew receives an invitation email with a link to the dashboard or app. Matthew clicks the link to access the content. AAD verifies that Prologika and Matthew have records in AAD. If so, AAD asks Matthew to sign in with his AAD credentials and grants him access to the shared Power BI content.

As for cost, Power BI licensing for external users is not much different from licensing internal users. In a nutshell, the external user must have a Power BI Pro license to access Power BI content in the sharing tenant. This license can be acquired in one of three ways:

  1. The sharing organization is on Power BI Premium – If Adventure Works is on Power BI Premium and the sharing workspace is in a premium capacity, Elena can share content to external users, just like she can share content with internal Power BI Free users.
  2. The sharing organization assigns Power BI Pro licenses – Elena can assign one of her organization’s Power BI Pro licenses to Matthew.
  3. The external organization assigns Power BI Pro licenses – In this case, Matthew has a Power BI Pro license from the Prologika’s Power BI tenant. Matthew can bring in his license to all organizations that share content with Prologika.

Sharing with Power BI Embedded

Sharing in Power BI Service is simple, but it has some important drawbacks:

  • Dependency on Azure Active Directory (AAD) – Every user must have an AAD account.
  • Inadequate support for B2C – AAD is expected to support Live IDs soon, such as outlook.com emails, but you'd need to wait for other providers. Yet, many B2C apps allow external customers to authenticate with email addresses of their choice. UPDATE 3/11/2018: Power BI Service now supports sharing using personal emails.
  • Per-user license – Despite that Power BI offers three licensing options for external users, every user must be covered by a Power BI Pro license.
  • Read-only reports – Users are limited to read-only reports. They can't change existing reports or create new ones.

Collectively known as Power BI Embedded, the Power BI embedding APIs can help your developers overcome these challenges in a cost-effective way. A developer can integrate any modern web-enabled app to embed Power BI dashboards, reports, and even natural language questions, so that they appear as a part of your company's offering, instead of redirecting the user to powerbi.com. Reports not only preserve their interactive features, but users can also edit them or create new reports from scratch (if your app lets them)!


There are two ways to acquire Power BI Embedded. If your organization is on Power BI Premium, you already have everything you need to embed content for both internal and external users (assuming a P plan, as EM plans are more restrictive and intended for embedding only). Most smaller companies and Independent Software Vendors (ISVs) looking to embed content for external users only would gravitate toward the Azure Power BI Embedded plans. There are currently six Azure plans, starting at $750/month, depending on how much capacity (memory, cores, and report views) you need. The Azure plans can also result in significant cost savings because they allow you to quickly scale up and down, and even pause them! For example, if most report activity happens within normal working hours, you can scale down to a lower plan outside the peak period.

Have you used Power BI Embedded before? When Microsoft introduced Power BI Premium in June 2017, they also revamped Power BI Embedded. Previously, Power BI Embedded was for embedding content for external users only. It was implemented as a Microsoft Azure Service and it had its own content storage and different APIs. The problem was that the old Power BI Embedded didn’t have feature parity with Power BI Service. This all changed with the new Power BI Embedded. Think of the new Power BI Embedded as Power BI Service with its own licensing model that has the same features as Power BI Service. The old Power BI Embedded will be discontinued in mid-2018, so it’s time to migrate and modernize your apps.

Power BI Embedded can deliver tremendous value to your business partners by bringing the engaging powerbi.com experience embedded in your portals and apps. If you need implementation help with Power BI Embedded, drop us a line at https://prologika.com/contact/. Prologika has successfully helped a few organizations integrate their apps with Power BI Embedded. Learn about the business value from one of our engagements in Microsoft's "ZynBit Empowers Sales with Microsoft Power BI Embedded" case study.

As you'd probably agree, the BI landscape is fast-moving and it might be overwhelming. If you need any help with planning and implementing your next-generation BI solution, don't hesitate to contact me. As a Microsoft Gold Partner and premier BI firm, we can help you plan and implement your data analytics projects, and you can rest assured that you'll get the best service. Regards,

Teo Lachev

Teo Lachev President and Owner
Prologika, LLC | Making Sense of Data
Microsoft Partner | Gold Data Analytics

Prologika Newsletter Fall 2017

Business Performance Management


 

As I'm writing this newsletter, Florida and Georgia (I live in Atlanta, GA) are in the midst of planning for hurricane Irma, which is expected to be one of the most powerful Atlantic Ocean hurricanes in recorded history. I hope that this won't be the case and we won't have to deal with a disaster of the magnitude of hurricane Harvey that hit Houston, TX just two weeks ago. To avoid and mitigate such disasters, a great deal of planning happens at every level: from government to citizens. Did you know that over 5 million people in Florida were evacuated, making this the biggest US displacement of people to date? Speaking of planning, how does your company approach business performance management? Do you use Excel spreadsheets or expensive high-end planning software? Next, I'll share how Prologika helped a large organization improve its budgeting and planning process, powered by a cost-effective solution based on Microsoft BI.


Why Business Performance Management?

Wikipedia defines Business Performance Management (BPM) as “a set of performance management and analytic processes that enables the management of an organization’s performance to achieve one or more pre-selected goals”. A simpler BPM definition might be a methodology to help the company predict its performance. An integral part of a BPM strategy is a process for Budgeting, Planning, and Forecasting that typically has the following workflow:

  1. Every planning year, a planner in the Finance department creates a budget for a certain number of future periods. The budget is revised, and multiple versions are proposed to management. Once the budget is approved, it becomes fixed for the duration of the planning year.
  2. As actuals come in, the planner works on forecast scenarios. Multiple forecast scenarios are typically prepared as the planning year progresses over time.
  3. The planner monitors the variance between actuals and budget, actuals and forecast versions, and between forecast versions to improve the planning accuracy.

Business Needs

When it comes to finance, nothing is simple, and BPM is no exception. In fact, your BPM solution might very well be the most complicated software you've ever developed. The temptation is to buy prepackaged software, but even that route would require a lot of customization and compromises. To gauge the complexity, let's look at some of the business requirements we faced.


Similar to a telco, this company has customers and service plans. As a B2B organization, its customers are other companies. Planning is done at the customer and service plan level, and across multiple subject areas. For example, the planning Excel report shows a subset of the measures for planning growth of the customer base. The planner can change the budget and forecast values of some measures, called drivers. For example, the planner can overwrite Gross Adds for future periods. Further, the planner can overwrite a rule in each cell to change how the driver cell value is computed. The rules are:

  • 0 (default rule) – By default, planned values are calculated as a 3-month average of the previous periods. For example, Gross Adds for Aug 2017 for the first plan is 221 and has a Gross Adds Rule of 0. This means that the system will calculate the value as a simple 3-month average of the values of the previous three months (221, 218, and 223). Note that a previous month might fall in the actuals range. Also notice that recursive calculations are required, where the value of each month is based on the same measure for previous periods (formalized right after this list).
  • 1 (6-month average) – Instead of a 3-month average, a 6-month average is used.
  • 2 (seasonality index) – The value is computed by multiplying the driver values for the past 12 months by a seasonality index.
  • Any other value – The planner enters the driver value manually.
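To make the recursion explicit, the default rule can be written in simplified notation (my notation, not taken from the client's specification) as v(t) = (v(t-1) + v(t-2) + v(t-3)) / 3, where each v(t-i) is either an actual value (if that month falls in the actuals range) or a planned value produced by the same formula. Because planned months feed into the months that follow, the calculation effectively references itself, which matters for the technology choice discussed below.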

The driver measures are used to calculate other (output) measures. For example, Average of Subscribers has a formula that is based on Gross Adds and other measures. Finally, the drivers and output measures are allocated across a Chart of Accounts dimension (not shown on the report) to give managers the financial perspective of how the company is doing.

The Solution

Previously, the company had used a home-grown Excel-based solution. However, the process was manual, lengthy, and error-prone, and it couldn't give management a quick and accurate picture of the company's performance. A Business Analytics group was responsible for retrieving and consolidating actuals. Then, the Finance group would manually input the actuals into an Excel template. Complicated Excel formulas were used to calculate drivers and output measures, but even Excel had difficulty with the more complex recursive measures. Most of the time and effort was spent on working the system rather than on business performance management. The company looked at other high-end financial solutions, with a starting price tag of $300K, plus yearly maintenance and consulting fees. In the end, the company entrusted Prologika to implement the solution.

We used Analysis Services 2016 Multidimensional. Initially, we considered Tabular, but we realized that the solution's complexity exceeded Tabular's capabilities, including:

  • No support for writeback – Currently, Tabular doesn't support writeback, although third-party solutions, such as Power Planner, can overcome this limitation.
  • No support for recursive measures – Currently, Tabular doesn’t support recursive measures, where the measure DAX formula references the same measure.
  • No scope assignments – Tabular doesn’t support scope assignments for allocations.
  • No parent-child dimensions and hierarchy functions – Tabular doesn’t support parent-child hierarchies and functions for navigating the hierarchy, such as Parent, Children, etc.

This is a classic example of using the best tool for the job. While Tabular could be a good fit for perhaps 80% of semantic models out there, Multidimensional is probably your best bet for Financial projects and projects that require massive data volumes. No tool is perfect. “When all you have (know) is a hammer, everything looks like a nail” doesn’t work for BI.

MS BI Events in Atlanta

  • Atlanta MS BI and Power BI Group: "Did Qlik run out of BI power?" presentation by Carlos Rodriquez on 9/25
  • Atlanta MS BI and Power BI Group: “Topic: TBD” presentation by Slalom Consulting on 10/30

As you'd probably agree, the BI landscape is fast-moving and it might be overwhelming. If you need any help with planning and implementing your next-generation BI solution, don't hesitate to contact me. As a Microsoft Gold Partner and premier BI firm, we can help you plan and implement your data analytics projects, and you can rest assured that you'll get the best service. Regards,

Teo Lachev

Teo Lachev President and Owner
Prologika, LLC | Making Sense of Data
Microsoft Partner | Gold Data Analytics