Testing DAX Measures

DAX can get complex and humble even experienced BI developers. Since Microsoft left us without a proper debugger, here are a couple of techniques that I use to debug DAX when the going gets tough:

  1. Variables – I often break down the formula into variables. As a bonus, variables make expressions easier to read and might yield performance gains. Consider the following SalesYoY% measure that calculates the variance in sales between the current period and the same period last year.
    SalesYoY% =
    VAR _LastYearSales =
        CALCULATE (
            [InternetSales],
            SAMEPERIODLASTYEAR ( 'Date'[Date] )
        )
    RETURN
        IF ( NOT ISBLANK ( [InternetSales] ), DIVIDE ( [InternetSales] - _LastYearSales, _LastYearSales ) )

    Let’s say you believe that last year’s sales look suspicious, such as the high value for 2011. You can comment out the last line and return the _LastYearSales variable to investigate further.

    SalesYoY% =
    VAR _LastYearSales =
        CALCULATE (
            [InternetSales],
            SAMEPERIODLASTYEAR ( 'Date'[Date] )
        )
    RETURN
        _LastYearSales
        -- IF ( NOT ISBLANK ( [InternetSales] ), DIVIDE ( [InternetSales] - _LastYearSales, _LastYearSales ) )
  2. The EvaluateAndLog function – With the introduction of the EvaluateAndLog function in Power BI, Microsoft has provided us with a poor man’s debugger that can print the output of a DAX expression. For example, to investigate last year’s sales, you can enclose it in EvaluateAndLog.
    SalesYoY% =
    VAR _LastYearSales =
        CALCULATE (
            [InternetSales],
            SAMEPERIODLASTYEAR ( 'Date'[Date] )
        )
    RETURN
        IF ( NOT ISBLANK ( [InternetSales] ), DIVIDE ( [InternetSales] - EvaluateAndLog ( _LastYearSales ), _LastYearSales ) )

    Then, you can use the SQL Server Profiler (distributed with SSMS) or the tool mentioned in the blog to see the output. I have the SQL Server Profiler registered as a Power BI Desktop external tool using the steps in this blog so that I can conveniently launch it connected to the Power BI Desktop model. Once the profiler opens, make sure to select the “DAX Evaluation Log” event.

    In this case, the event outputs last year’s sales broken down by year because the visual slices the measure by year (a labeled variation of the measure is sketched right after this list).
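
On a related note, EvaluateAndLog accepts an optional label argument (and a maximum row count) that tags the output in the “DAX Evaluation Log” event, which helps when you log several expressions at once. Here is a minimal sketch of the same measure with a label added; treat it as an illustration rather than the exact formula used above:

SalesYoY% =
VAR _LastYearSales =
    CALCULATE (
        [InternetSales],
        SAMEPERIODLASTYEAR ( 'Date'[Date] )
    )
RETURN
    IF (
        NOT ISBLANK ( [InternetSales] ),
        DIVIDE (
            -- the "LY Sales" label shows up in the DAX Evaluation Log event, making the output easy to spot
            [InternetSales] - EvaluateAndLog ( _LastYearSales, "LY Sales" ),
            _LastYearSales
        )
    )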

Atlanta MS BI and Power BI Group Meeting on December 5th (Automate and Improve Planning, Budgeting and Forecasting)

The Atlanta MS BI and Power BI Group is resuming in-person meetings! Please join us for the next meeting on Monday, December 5th, at 6:30 PM ET. Your humble correspondent will show you how to implement a custom solution for automating planning, budgeting, and forecasting based on a real-life project. For more details and to sign up, visit our group page.

WE ARE RESUMING IN-PERSON MEETINGS STARTING DECEMBER 5, 2022, AT THE MICROSOFT OFFICE IN ALPHARETTA. WE STRONGLY ENCOURAGE YOU TO ATTEND THE EVENT IN PERSON FOR BEST EXPERIENCE. PLEASE NOTE THAT GUESTS ENTERING MICROSOFT BUILDINGS IN THE U.S. MUST PROVIDE PROOF OF VACCINATION OR SELF-ATTEST WITH HEALTHCHECK (HTTPS://AKA.MS/HEALTHCHECK). ALTERNATIVELY, YOU CAN JOIN OUR MEETINGS ONLINE VIA MS TEAMS. WHEN POSSIBLE, WE WILL RECORD THE MEETINGS AND MAKE RECORDINGS AVAILABLE AT HTTPS://BIT.LY/ATLANTABIRECS. PLEASE RSVP ONLY IF COMING TO OUR IN-PERSON MEETING.

Presentation: Automate and Improve Budgeting, Planning, and Forecasting

Date: December 5th

Time: 6:30 – 8:30 PM ET

Place: Onsite and online

 

ONSITE

Microsoft Office (Alpharetta)

8000 Avalon Boulevard Suite 900

Alpharetta, GA 30009

ONLINE

Click here to join the meeting

Overview: Business Performance Management (BPM) is a methodology to help a company predict its performance. An integral part of a BPM strategy is a process for Budgeting, Planning, and Forecasting, which is typically performed by the Finance department. When it comes to Finance, nothing is simple, and budgeting is no exception. The temptation is to buy expensive prepackaged software, but even that route would require a lot of customization and compromises.

Join this session to learn how to implement your own home-grown solution using Microsoft Analysis Services and Excel. I’ll share lessons learned from a real-life project.

Speaker: Teo Lachev is a consultant, author, and mentor, with a focus on Microsoft BI. Through his Atlanta-based company Prologika he designs and implements innovative solutions that bring tremendous value to his clients. Teo has authored and co-authored several books, and he has been leading the Atlanta Microsoft Business Intelligence group since he founded it in 2010. Microsoft has recognized Teo’s contributions to the community by awarding him the prestigious Microsoft Most Valuable Professional (MVP) Data Platform status for 15 years. Microsoft has selected Teo as one of only 30 FastTrack Solution Architects for Power BI worldwide.

Sponsor: Prologika (a Microsoft Gold Partner in Data Analytics and Data Platform and Power BI Red Carpet Partner) helps organizations of all sizes to make sense of data. Your BI project will be your best investment, we guarantee it! prologika.com

Prototypes with Pizza: Power BI latest news

Download presentation


Estimating DAX Query Completion Time

Usually, DAX queries execute very fast, like flashes in a gold seeker’s pan. Sometimes, however, you could end up with a massive DAX query that takes minutes if not hours. For example, a financial institution requested credit monitoring results to be evaluated overnight and saved in a relational database for fast retrieval. The query involved processing some 300+ measures for 40 million customers and took about an hour to complete. While working on optimizing the query and tracing its execution, I enabled the VertiPaq SE Query End event and was able to monitor how far the query got by seeing which measure was being processed. This was also useful for understanding which measures are more expensive.
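
For context, the nightly query was of the “wide” variety sketched below (the measure names here are purely illustrative and not from the actual project); each requested measure typically surfaces as one or more recognizable VertiPaq SE Query End events, which is what makes this progress-tracking trick possible:

// Illustrative sketch only – hypothetical measure names standing in for the real 300+ measures
EVALUATE
SUMMARIZECOLUMNS (
    'Customer'[Customer Registry ID],
    "Credit Utilization %", [Credit Utilization %],
    "Average Balance", [Average Balance],
    "Days Past Due", [Days Past Due]
    // ...and a few hundred more measures
)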

In general, the server doesn’t process measures in parallel. A measure may produce multiple storage queries, some running in parallel and some sequentially. The VertiPaq SE Query End event shows the SQL-like query that the server sent to the storage engine, but usually you can easily correlate it to the DAX formula of the measure being processed. Thanks to the performance enhancements Microsoft has been making, such as horizontal fusion, multiple measures may be coalesced into the same storage operation, so a query with multiple measures might yield some parallelism or could be entirely sequential. On the other hand, there is plenty of parallelism within a VertiPaq query.

Atlanta MS BI and Power BI Group Meeting on November 7th (Data Science for Power BI Developers)

Please join us online for the next Atlanta MS BI and Power BI Group meeting on Monday, November 7th, at 6:30 PM ET. Sandeep Pawar will join us again to show us the Power BI data science features and how you can create ML models. For more details and to sign up, visit our group page.

Presentation: Data Science for Power BI Developers
Date: November 7th
Time: 6:30 – 8:30 PM ET
Place: Click here to join the meeting
Overview: The use of Data Science tools and techniques has grown in the last few years, and BI developers are collaborating closely with Data Scientists. In this session, we will learn about some of the core concepts in data science, key terminologies, and the overall process so that you as a BI developer can effectively collaborate with your Data Science team. You will learn about analogous features in Power BI, their pitfalls, the overall Machine Learning process, and the questions you should be asking the next time you are consuming an ML model in Power BI.

 

This session won’t teach you how to create Machine Learning models; rather, it will show you how to consume Machine Learning models as a BI developer to help your business users.

Speaker: Sandeep Pawar is a Sr. Power BI Architect at Hitachi Solutions America. He has extensive experience in creating data analytics and data science solutions. You can read more about him and his passion for data at www.pawarbi.com or follow him on Twitter at @PawarBI.
Prototypes without Pizza: Power BI Latest


Testing RLS with Power BI Shared Datasets

In a typical engagement, I create one or more organizational semantic models and “report packs”, such as a Sales Report Pack, an Inventory Report Pack, etc. These report packs are typically implemented as Power BI reports connected to the semantic model as a shared dataset using the Power BI Datasets connector. Reports sanctioned by IT are published to a dedicated workspace, such as Corporate BI. Departmental reports are deployed to their respective workspaces, such as Sales, to enforce content-level security. Usually, the semantic model has row-level security (RLS) roles defined to enforce restricted access to data depending on the identity of the interactive user.
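
For reference, an RLS role in such a model boils down to a DAX table filter expression evaluated for the signed-in user. Here is a minimal sketch (the table and column names are hypothetical, not from a specific client model):

-- Hypothetical table filter for a "Restricted Access" role on a Salesperson table
-- that stores each user's sign-in email
'Salesperson'[Email] = USERPRINCIPALNAME ()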

Although not immediately obvious, here is how you can test RLS directly in Power BI Service to ensure that connected reports produce the expected results under someone else’s identity:

  1. In powerbi.com, go to the workspace where the shared dataset is published.
  2. On the workspace page, click the Datasets tab, and then click the ellipsis button (…) next to the shared dataset. In the dataset settings, click Security.
  3. Expand the … button next to the Restricted Access role, and then click “Test as role”. This opens whatever report is included in the shared dataset pbix file (I typically include a few reports there for testing).
  4. You can expand the Restricted Access dropdown and enter the user email of the user whose RLS permissions you want to test or the role you want to apply. In the screenshot below, I’m testing under my identity applying the rules in the Restricted Access role. If I wanted to test what a specific user would see, I’d type in that user’s email.
  5. To test other reports for this user, expand the powerbi.com left navigation pane and then click the desired report. Power BI will navigate to that report while still honoring RLS.

Correlating Analysis Services Errors with Measures

This blog builds upon my previous “Resolving Tabular Conversion Errors” post and applies to Analysis Services in all flavors (Power BI, Multidimensional, and Tabular). In the scenario I described in the previous blog, the server at least told us the name of the offending measure in the error description. But sometimes you might not be that lucky. For example, recently I got this error when running a DAX query requesting many measures: “Microsoft OLE DB Provider for Analysis Services.” Hresult: 0x80004005 Description: “MdxScript(Model) (2000, 133) Failed to resolve name ‘SYNTAXERROR’. It is not a valid table, variable, or function name.” All we know is that there is a syntax error in some measure, but good luck finding it if you have hundreds of measures in the query and your model. However, the (2000, 133) section references the line number and column number in the MDX script (yep, MDX even if you use Tabular), so if we can get the script, we might be able to correlate the error.

Getting that script is elusive, as the only way I know of for Tabular models is to trace the “Execute MDX Script Begin” or “Execute MDX Script End” events. But these events are generated only the first time a user with a given set of role permissions connects to a cube after processing, clearing the cache, or restarting the server. So, after you connect the SQL Server Profiler to the cube and check either of these two events (you must click Show All Events because they are not shown by default), execute Clear Cache or process a table. Then, connect to the cube and execute any MDX or DAX query. In the Profiler, locate the “Execute MDX Script Begin” event. The payload should start with CALCULATE followed by all measure definitions.

CALCULATE;
CREATE MEMBER CURRENTCUBE.Measures.[__Default measure] AS 1; ALTER CUBE CURRENTCUBE UPDATE DIMENSION Measures, Default_Member = [__Default measure];
CREATE
MEASURE 'Date'[Days Current Quarter to Date] = COUNTROWS ( DATESQTD ( 'Date'[Date] ) )
MEASURE 'Date'[Days in Current Quarter] = COUNTROWS ( DATESBETWEEN ( 'Date'[Date], STARTOFQUARTER ( LASTDATE ( 'Date'[Date] ) ), ENDOFQUARTER ( 'Date'[Date] ) ) )
MEASURE 'Internet Sales'[Internet Distinct Count Sales Order] = DISTINCTCOUNT ( [Sales Order Number] )
….

Copy that payload and paste it in an editor that has line numbers, such as Notepad++. Then go to the line number mentioned in the error description (you can press Ctrl+G in Notepad++ to jump to a line).

Using this technique, I was able to narrow down the measure and discovered that I had missed a comma in the DIVIDE DAX function. I had made the change in Tabular Editor, but it didn’t catch the syntax error.
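
For illustration only (this is not the actual measure from the project), the mistake had the following shape, where the missing comma between the DIVIDE arguments slips past Tabular Editor but surfaces when the model’s MDX script is evaluated:

-- Contrived example with hypothetical measure names
Gross Margin % = DIVIDE ( [Sales Amount] - [Total Cost] [Sales Amount] )   -- broken: missing comma before the second argument

-- Corrected version
Gross Margin % = DIVIDE ( [Sales Amount] - [Total Cost], [Sales Amount] )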

Resolving Tabular Conversion Errors

A scheduled SSIS job that executes a massive DAX query against an on-prem Tabular server (Power BI can also generate this error) one day decided to throw an error: “Source: “Microsoft OLE DB Provider for Analysis Services.” Hresult: 0x80004005 Description: “MdxScript(Model) (2020, 98) Calculation error in measure ‘Account Snapshot'[Average utilisation % of all CR active current accounts last 3 months]: The result of a conversion or arithmetic operation is either too large or too small.” At least we know the offending measure, but which row is causing the error? The query requests some 300+ measures for 120 million customers, so I thought someone might find the troubleshooting technique useful. Let’s ignore what the measure does for now, except to mention that it performs a division of two other measures.

We can use the DAX ISERROR function to check if a measure throws an error. So, the first step is to wrap the measure with ISERROR and execute the query for all customers.

EVALUATE
FILTER (
    CALCULATETABLE (
        ADDCOLUMNS (
            VALUES ( 'Customer'[Customer Registry ID] ),
            "x", ISERROR ( [Average utilisation % of all CR active current accounts last 3 months] )
        ),
        FILTER ( VALUES ( 'Date'[Date] ), 'Date'[Date] = DATEVALUE ( "2022-10-14" ) )
    ),
    [x]
)

The query uses the ADDCOLUMNS function to add the measure so that it’s evaluated for each customer as of the date when the job ran. Make sure to also add any dependent measures that reference the measure in the error description. ISERROR will return TRUE if the measure throws an error. The final step is to wrap CALCULATETABLE in FILTER, which returns only the offending customer(s) for which the measure errors out ([x] is TRUE).

Using this technique, I managed to narrow the error down to data quality issues concerning large negative credit limits for some customers. BTW, the Tabular engine raises this error when converting a double-precision floating-point number to an integer and the former is out of the range of valid integer numbers.
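
Once the offending customers were identified, a simple data-profiling query along these lines can surface the extreme values behind the conversion error (the ‘Credit Limit’ column is a hypothetical stand-in for the actual source column, and it assumes Account Snapshot relates to Customer):

-- Hypothetical follow-up check: list the most negative credit limits per customer
EVALUATE
TOPN (
    100,
    SUMMARIZE (
        'Account Snapshot',
        'Customer'[Customer Registry ID],
        'Account Snapshot'[Credit Limit]
    ),
    'Account Snapshot'[Credit Limit], ASC
)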

Embedding Power BI Reports in Internal Portals

This question pops up over and over. What’s the easiest way to embed a Power BI or paginated report hosted in Power BI Service in an internal portal, such as a custom site developed by your team or an on-prem SharePoint Server, so internal users can see all reports in one place? Use the “Embed report” feature, which will give you a link or iframe code that you can readily add to the portal without any coding (notice that there is a SharePoint Online option because SharePoint Online has a special web part for this purpose).

Will you get single sign-on (SSO) so that the interactive user credentials automatically flow to Power BI? Nope. To the best of my knowledge, Power BI doesn’t support single sign-on even if your organization has extended its Active Directory to Azure, but let me know if you have found a workaround. The first time the user opens a Power BI report in the browser session, the user will get prompted to enter credentials. The user won’t be asked to authenticate when viewing subsequent reports in the same browser session because the authentication token will be cached. The browser can also mitigate the issue by saving the user credentials until the password changes.

What if you want to avoid the prompt under any circumstance? Write code and use Power BI Embedded. Down the custom code rabbit hole we go… The upside is that you’ll get more control over the embedded reports thanks to the Power BI JavaScript library, such as replacing the Filter Pane with your own filtering mechanism.

Atlanta MS BI and Power BI Group Meeting on October 3rd (Auto-provision embedded report delivery for your customers)

Please join us online for the next Atlanta MS BI and Power BI Group meeting on Monday, October 3rd, at 6:30 PM ET. Tom Huguelet (Lucient) will present a real-world case study and the challenges involved in bringing a Power BI application from an initial Excel-based POC to an ISV platform that auto-provisions for new clients, using Power BI Embedded Azure Premium Capacity. For more details and to sign up, visit our group page.

Presentation: Auto-provision embedded report delivery for your customers
Date: October 3rd
Time: 6:30 – 8:30 PM ET
Place: Click here to join the meeting
Overview: In this session, we’ll look at a real-world case study and the challenges involved in bringing a Power BI application from an initial Excel-based POC to an ISV platform that auto-provisions for new clients, using Power BI Embedded Azure Premium Capacity.

We will cover:

• Initial Business Case and Project Team

• Testing Embedded in a Sandbox Environment (Authentication)

• Enabling User-Customized Experiences in Embedded

• Using REST APIs for Automatic Provisioning

• Lessons learned and challenges encountered

Speaker: Tom Huguelet has been engaged in using Business Intelligence, Dashboards & Analytics to enable Corporate Performance Management since 1999. While coming from a technological background, Tom has always focused on the people and process aspects of how Business Intelligence can create positive changes within an organization. In his role as a Practice Director, he leads and mentors technical teams and advises business stakeholders on how to build the cross-disciplinary teams necessary to be successful with Business Intelligence. Tom holds several Microsoft certifications including Microsoft Certified Trainer, is a frequent speaker at industry events, has co-authored 3 books on SQL Server and Data Warehousing, and helped start a BI Roundtable group in Chicago.
Prototypes without Pizza: Power BI Latest


Power BI Visual Help Tooltips

Want to display a visual-level hint to perplexed users that explains what your visual is supposed to reveal? Like me, you’ve probably missed the handy Power BI help tooltips feature that allows you to pop up some helpful text for each visual. On the Format tab of the Visualizations pane, expand the Help Tooltip section, and enter the text. A question mark glyph then shows up in the visual header, and the tooltip appears when the user hovers over it. Besides showing static text, you could alternatively redirect the user to a report page that provides more context.

Unfortunately, the tooltip text can’t be expression-based due to a long-standing and overdue Power BI limitation that allows only a small set of properties to be bound to measures.