Extracting a DAX Query Plan With Analysis Services 2016 Extended Events



In SQL Server 2016 we have the option to start an Extended Events session using SQL Server Management Studio. I would like to know how I can use this solution to find the query plan for a DAX query, just like we can for T-SQL statements.


Extended Events is a lightweight tracing and event capturing solution for SQL Server and Analysis Services. It's the replacement for SQL Server Profiler (which is still around, though) and was introduced in SQL Server 2008. In SQL Server 2012, Extended Events (xEvents) were added for Analysis Services as well, but without a graphical user interface. SQL Server 2016 introduced a user interface for xEvents in SSAS. In this tip, we will use xEvents to find the DAX query plan so it can be used for query optimization purposes.

If you would like to learn more about xEvents for SSAS, you can check out these tips:

The Test Query

We will use a DAX query against the WideWorldImporters Tabular model. You can find out more about this model in the tip Using Calculated Tables in Analysis Services Tabular 2016 for Aggregate Tables – Part 3. We execute the following DAX query against the model:

EVALUATE
SUMMARIZECOLUMNS('City'[Sales Territory], "% of Grand Total", 'Order'[% of Grand Total])
ORDER BY [% of Grand Total] DESC, 'City'[Sales Territory]

In the latest release of SQL Server Management Studio you can execute DAX queries:


This query is the equivalent of the following visual in Power BI Desktop:

bar chart

The "% of Grand Total" is calculated with the following formula:

% of Grand Total :=
VAR GrandTotal = CALCULATE ( [Total Sales], ALL ( 'Order' ) ) -- our variable
RETURN DIVIDE ( [Total Sales], GrandTotal ) -- when using variables, use the RETURN keyword to return the result of the formula

It was introduced in the tip Enhancements for the DAX language in Analysis Services 2016.

Starting the xEvents Session

The following XMLA statement creates an xEvents session that tracks the following events:

  • DAXQueryPlan - which as the name suggests, tracks the query plans
  • QueryEnd - which will give an indication of the execution time and it also contains the actual DAX query
  • VertiPaqSEQueryEnd - which tracks the storage engine (SE) queries. VertiPaq was the initial codename for the Power Pivot technology (which also runs behind the scenes in SSAS Tabular, Power BI Desktop and columnstore indexes).
<Create xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <ObjectDefinition>
    <Trace>
      <ID>test</ID>
      <Name>test</Name>
      <XEvent xmlns="http://schemas.microsoft.com/analysisservices/2011/engine/300/300">
        <event_session name="test" dispatchLatency="0" maxEventSize="0" maxMemory="4" memoryPartition="none" eventRetentionMode="AllowSingleEventLoss" trackCausality="true" xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
          <event package="AS" name="DAXQueryPlan" />
          <event package="AS" name="QueryEnd" />
          <event package="AS" name="VertiPaqSEQueryEnd" />
          <target package="package0" name="event_stream" />
        </event_session>
      </XEvent>
    </Trace>
  </ObjectDefinition>
</Create>

The output of the session is sent as an event stream, which you can watch live in SSMS:

watch live data
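The session keeps streaming events until it is explicitly dropped, so remember to clean it up when you are done. A minimal sketch of the corresponding XMLA Delete command, assuming the trace was created with the ID test (matching the session name used above):

```xml
<!-- Drops the xEvents trace session from the SSAS instance.
     Replace "test" with the ID your session was created with. -->
<Delete xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <TraceID>test</TraceID>
  </Object>
</Delete>
```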

Finding the Query Plan

Let's start the live stream and run our query. As you can see, 5 events have been captured:

query end event

The QueryEnd event contains the DAX query, as indicated in the screenshot, but also the duration, which was only 30ms. Let's take a look at the DAXQueryPlan event to find the query plan. There are actually two plans: the logical plan and the physical plan. The logical plan can be found in the first DAXQueryPlan event.

Double clicking on the TextData field gives a pop-up with the full textual DAX query plan:

logical query plan

Unfortunately, there's no graphical query plan like we have for T-SQL queries. The plan looks quite messy, so let's format it a bit:

logical query plan formatted

When we remove all the parameters, we have the following left:

logical query plan no params

In the logical query plan, we can recognize our original query which calculated the percentage of grand total for the sales territories, first ordered by descending result and then by sales territory. The percentage of grand total was calculated by storing the grand total of [Total Including Tax] in a variable first (which corresponds with VarScope) and then dividing the sum of Total Including Tax by this grand total. All of this is directly reflected in the logical query plan.

The physical query plan translates the logical plan into physical operators. It looks like this:

physical DAX query plan

Let's format the plan a bit:

physical DAX query plan formatted

When we remove all the parameters, this is what we have left:

physical DAX query plan no params

We can also take a look at the storage engine queries. There are two xmSQL queries (which are the queries used by the storage engine):

SELECT
    [City (25)].[Sales Territory (109)] AS [City (25)$Sales Territory (109)]
   ,SUM([Order (22)].[Total Including Tax (102)]) AS [$Measure0]
FROM [Order (22)]
LEFT OUTER JOIN [City (25)] ON [Order (22)].[City Key (86)]=[City (25)].[City Key (104)];

SELECT
    [City (25)].[Sales Territory (109)] AS [City (25)$Sales Territory (109)]
   ,SUM([Order (22)].[Total Including Tax (102)]) AS [$Measure0]
FROM [Order (22)]
LEFT OUTER JOIN [City (25)] ON [Order (22)].[City Key (86)]=[City (25)].[City Key (104)];
[Estimated size (volume, marshalling bytes): 13, 208]

As you can see, they have a structure very similar to regular SQL. One query most likely retrieves the value for the variable (the grand total) and the other retrieves the values of [Total Including Tax] for the different Sales Territories. This is why using variables can optimize the performance of your DAX queries: only one storage engine query is needed to retrieve the value, which is then re-used throughout the entire formula. Without variables, multiple storage engine queries may be needed to retrieve the same value multiple times.
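To make the variable argument concrete, here is a sketch of a hypothetical measure (the measure name and the guard condition are illustrative, not part of the model above) in which the grand total is referenced twice. With the variable, the value is computed once and re-used; writing the CALCULATE expression inline at both spots could instead trigger duplicate storage engine work:

```dax
-- Hypothetical measure: the grand total is used both as a guard condition
-- and as the denominator. The variable ensures it is evaluated only once.
% of Grand Total (guarded) :=
VAR GrandTotal = CALCULATE ( [Total Sales], ALL ( 'Order' ) )
RETURN IF ( GrandTotal > 0, DIVIDE ( [Total Sales], GrandTotal ) )
```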

All of this information can also be retrieved using SQL Server Profiler or by using DAX Studio.
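If you lose track of which trace sessions are active on the server, you can also query the DISCOVER_TRACES schema rowset from an MDX query window in SSMS. A minimal sketch:

```sql
-- Lists the trace sessions currently defined on the SSAS instance,
-- which should include any xEvents session created earlier.
SELECT * FROM $SYSTEM.DISCOVER_TRACES
```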

Next Steps


About the author
MSSQLTips author Koen Verbeeck Koen Verbeeck is a seasoned business intelligence consultant at AE. He has over a decade of experience with the Microsoft Data Platform in numerous industries. He holds several certifications and is a prolific writer contributing content about SSIS, ADF, SSAS, SSRS, MDS, Power BI, Snowflake and Azure services. He has spoken at PASS, SQLBits, dataMinds Connect and delivers webinars on MSSQLTips.com. Koen has been awarded the Microsoft MVP data platform award for many years.

This author pledges the content of this article is based on professional experience and not AI generated.


Comments For This Article

Saturday, October 29, 2022 - 4:38:14 AM - David Back To Top (90646)
Is it possible to capture a DAX Physical Query Plan that has more than 10,000 rows?
After line 10,000 it just says "More operators skipped..." and the rest of the Query Plan is truncated.

The same happens when I try to export the JSON file in DAX Studio.
I was wondering if this is a parameter in SSAS that can be increased somehow.

Wednesday, October 4, 2017 - 3:25:43 PM - Koen Verbeeck Back To Top (66903)

Hi Marco,

thanks for your valuable input!
Of course this article is for illustrative purposes (I believe I used fewer events than in the chapter in your book) and in production scenarios you definitely need to be more selective, so you certainly have a good argument. :)


Wednesday, October 4, 2017 - 12:56:33 PM - Marco Russo Back To Top (66897)

Just take care of one thing. Using Extended Events is the way to go to capture events like QueryEnd, so you capture all the events that run on a server. Getting all the VertiPaq SE Query events is not a good idea because of the volume, but it is still doable. Capturing the Query Plan for a long time could slow down the server, because a query plan can be up to 10,000 rows (it is cut off if it is larger than that), which means it could easily be 1MB or more of text. Even if extended events are fast, the issue is that generating such a string is expensive and could slow down the server, too - not to mention the space required to archive it.

For this reason:

  • We have a separate Query Plan filter in DAX Studio
  • We use Profiler Events in DAX Studio instead of extended events: we are compatible with Power Pivot and older versions of Analysis Services in this way

We evaluated using Extended Events in DAX Studio, but we would actually lose some compatibility without gaining much in return.

However, as I said, using Extended Events is the right choice for capturing events for a long time. Just don't capture too many query plan events!
