Collecting and Storing Poor Performing SQL Server Queries for Analysis

By: Ben Snaidero   |   Comments (11)   |   Related: > Performance Tuning


In an ideal world, all of our queries would be optimized before they ever reach a production SQL Server environment, but this is not always the case. Smaller data sets, different hardware, schema differences, etc. can all affect how our queries perform. This tip looks at a method for automatically collecting and storing poor performing SQL statements so they can be analyzed at a later date.


With the Dynamic Management Views and functions available starting in SQL Server 2005, capturing information about the performance of your SQL queries is a fairly straightforward task. The sys.dm_exec_query_stats view, together with the sys.dm_exec_sql_text and sys.dm_exec_query_plan functions, gives you all the information you need to determine how the SQL in your cache is performing.

Using the view and functions above, we can create a query that pulls out all the SQL queries currently in the cache. Along with the query text and plan, we can also extract some important statistics on the performance of each query and the resources used during execution. Here is the query:

SELECT TOP 20
    GETDATE() AS "Collection Date",
    qs.execution_count AS "Execution Count",
    SUBSTRING(qt.text,qs.statement_start_offset/2 +1, 
                 (CASE WHEN qs.statement_end_offset = -1 
                       THEN LEN(CONVERT(NVARCHAR(MAX), qt.text)) * 2 
                       ELSE qs.statement_end_offset END -
                            qs.statement_start_offset
                 )/2
             ) AS "Query Text", 
     DB_NAME(qt.dbid) AS "DB Name",
     qs.total_worker_time AS "Total CPU Time",
     qs.total_worker_time/qs.execution_count AS "Avg CPU Time (ms)",     
     qs.total_physical_reads AS "Total Physical Reads",
     qs.total_physical_reads/qs.execution_count AS "Avg Physical Reads",
     qs.total_logical_reads AS "Total Logical Reads",
     qs.total_logical_reads/qs.execution_count AS "Avg Logical Reads",
     qs.total_logical_writes AS "Total Logical Writes",
     qs.total_logical_writes/qs.execution_count AS "Avg Logical Writes",
     qs.total_elapsed_time AS "Total Duration",
     qs.total_elapsed_time/qs.execution_count AS "Avg Duration (ms)",
     qp.query_plan AS "Plan"
FROM sys.dm_exec_query_stats AS qs 
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt 
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
WHERE
     qs.execution_count > 50 OR
     qs.total_worker_time/qs.execution_count > 100 OR
     qs.total_physical_reads/qs.execution_count > 1000 OR
     qs.total_logical_reads/qs.execution_count > 1000 OR
     qs.total_logical_writes/qs.execution_count > 1000 OR
     qs.total_elapsed_time/qs.execution_count > 1000
ORDER BY
     qs.execution_count DESC,
     qs.total_elapsed_time/qs.execution_count DESC,
     qs.total_worker_time/qs.execution_count DESC,
     qs.total_physical_reads/qs.execution_count DESC,
     qs.total_logical_reads/qs.execution_count DESC,
     qs.total_logical_writes/qs.execution_count DESC

This query can easily be modified to capture something specific if you are trying to solve a particular problem. For example, if you are currently experiencing a CPU issue on your SQL instance, you could alter the WHERE clause to capture only SQL queries where the worker time is high. Similarly, if you were having an issue with IO, you could capture only SQL queries where the reads or writes are high. Note: The ORDER BY clause is only needed if you keep the TOP parameter in your query. For reference, I've included below an example of the output of this query.
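As a minimal sketch of such a CPU-focused variant (the 50,000 threshold is illustrative only and should be tuned for your workload; note that total_worker_time is reported in microseconds):

```sql
-- Capture only statements with a high average CPU cost per execution
-- (threshold is an illustrative value, not a recommendation)
SELECT
     GETDATE() AS "Collection Date",
     qs.execution_count AS "Execution Count",
     qs.total_worker_time/qs.execution_count AS "Avg CPU Time",
     qt.text AS "Query Text"
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt
WHERE qs.total_worker_time/qs.execution_count > 50000;
```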

Poor SQL Query Output

Also, if you click on the data in the "Plan" column it will display the execution plan in graphical format in a new tab.

Poor SQL Query Plan

Now that we have a query to capture what we are looking for, we need somewhere to store the data. The following table definition can be used to store the output of the above query. To insert the results, we just add the line "INSERT INTO [DBA].[dbo].[My_Poor_Query_Cache]" ahead of the query above.

CREATE TABLE [DBA].[dbo].[My_Poor_Query_Cache] (
 [Collection Date] [datetime] NOT NULL,
 [Execution Count] [bigint] NULL,
 [Query Text] [nvarchar](max) NULL,
 [DB Name] [sysname] NULL,
 [Total CPU Time] [bigint] NULL,
 [Avg CPU Time (ms)] [bigint] NULL,
 [Total Physical Reads] [bigint] NULL,
 [Avg Physical Reads] [bigint] NULL,
 [Total Logical Reads] [bigint] NULL,
 [Avg Logical Reads] [bigint] NULL,
 [Total Logical Writes] [bigint] NULL,
 [Avg Logical Writes] [bigint] NULL,
 [Total Duration] [bigint] NULL,
 [Avg Duration (ms)] [bigint] NULL,
 [Plan] [xml] NULL
)
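As a minimal sketch of the insert pattern (the column list here is trimmed for brevity; the actual job step would select every column produced by the collection query, in table-column order):

```sql
-- Illustrative only: a trimmed-down version of the INSERT ... SELECT
INSERT INTO [DBA].[dbo].[My_Poor_Query_Cache]
     ([Collection Date], [Execution Count], [Query Text])
SELECT
     GETDATE(),
     qs.execution_count,
     qt.text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt;
```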

Finally, we'll use the SQL Server Agent to schedule this query to run. Your application and environment will determine how often you want to run it. If queries stay in your SQL cache for a long period of time, this can be run fairly infrequently; however, if the opposite is true, you may want to run it more often so that any really poor SQL queries are not missed. Here are a few snapshots of the job I created. The T-SQL to create this job can be found here.
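As a rough sketch, such a job can be created with the msdb scheduling procedures (the job name, schedule, and abbreviated command below are illustrative; the real step would contain the full INSERT ... SELECT from above):

```sql
USE msdb;
GO
-- Create the job and a single T-SQL step (names are illustrative)
EXEC dbo.sp_add_job @job_name = N'Collect Poor Performing Queries';
EXEC dbo.sp_add_jobstep
     @job_name = N'Collect Poor Performing Queries',
     @step_name = N'Capture query stats',
     @subsystem = N'TSQL',
     @database_name = N'DBA',
     @command = N'INSERT INTO dbo.My_Poor_Query_Cache ... -- full collection query goes here';
-- Run every hour (adjust to how long plans live in your cache)
EXEC dbo.sp_add_jobschedule
     @job_name = N'Collect Poor Performing Queries',
     @name = N'Hourly',
     @freq_type = 4,            -- daily
     @freq_interval = 1,
     @freq_subday_type = 8,     -- unit: hours
     @freq_subday_interval = 1; -- every 1 hour
EXEC dbo.sp_add_jobserver @job_name = N'Collect Poor Performing Queries';
```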

Poor SQL Job Step
Poor SQL Job Schedule

That's it. Now, whenever you have spare time you can query this table and start tuning.
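When that time comes, a simple starting point might look like the following (a sketch; order by whichever collected metric matters most to you):

```sql
-- Worst average duration collected over the last day
SELECT TOP 10 *
FROM [DBA].[dbo].[My_Poor_Query_Cache]
WHERE [Collection Date] > DATEADD(DAY, -1, GETDATE())
ORDER BY [Avg Duration (ms)] DESC;
```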


About the author
Ben Snaidero has been a SQL Server and Oracle DBA for over 10 years and focuses on performance tuning.

This author pledges the content of this article is based on professional experience and not AI generated.


Comments For This Article

Tuesday, November 5, 2013 - 9:37:34 AM - Ben Snaidero Back To Top (27392)

I think you may have a typo or missed something with cut and paste.  I was able to run the query on 2008 R2 without an error as is.


Monday, November 4, 2013 - 6:34:36 PM - Uma Umakanth Back To Top (27385)


Msg 102, Level 15, State 1, Line 24
Incorrect syntax near '.'.


CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS qt




I have 2008 R2.


Any help, Please?






Monday, February 20, 2012 - 7:06:35 AM - Bob Barrows Back To Top (16091)

Yes, that was my first thought. The problem is, there are so many variations of that "INSERT STATEMENT TO EXCLUDE" that it would probably take 24 hours to run this query. Perhaps I can filter on the duration, including only the queries whose durations are less than those in my current top ten - yes, that appears to be the way to go. Just keep decreasing the maximum duration until I start seeing queries that I need to focus on.

Sunday, February 19, 2012 - 11:48:34 PM - Ben Snaidero Back To Top (16085)


That's the main reason I prefer doing things myself with TSQL, lets me customize things as much or as little as I want/need to.  For the example you mention above you could add a where clause in the query from my tip similar to:

WHERE SUBSTRING(qt.text,qs.statement_start_offset/2 +1,
                 (CASE WHEN qs.statement_end_offset = -1
                       THEN LEN(CONVERT(NVARCHAR(MAX), qt.text)) * 2
                       ELSE qs.statement_end_offset END -
                            qs.statement_start_offset
                 )/2) NOT LIKE '%INSERT STATEMENT TO EXCLUDE%'

This would always exclude those statements that you know there is nothing you can do to improve.


Sunday, February 19, 2012 - 1:19:00 PM - Bob Barrows Back To Top (16078)

I've used these in the past to attempt to analyze the application performance on my server with less than satisfactory results.  The top 10 worst performers are invariably the data-load queries that run once or twice per day during off-hours. If there was a way to exclude these, I would be very happy.

Of course, I'm talking about the performance dashboard results. Perhaps I could modify your example to exclude the data-load queries. Time to start experimenting I guess.

Wednesday, February 8, 2012 - 12:55:27 PM - SandraV Back To Top (15942)

Jimbo:  See Aaron's post on this subject:  Unfortunately, I had the problem before 2008 R2 SP1 and after.

Wednesday, February 8, 2012 - 10:16:50 AM - Ben Snaidero Back To Top (15941)

I have not used the performance dashboard reports so I can't say exactly what the difference is if any.  Mine is based on the DMVs as I am sure the dashboard report is as well so I'd guess there would be some overlap.  The main benefit to doing it with your own script is you can customize it as much as you want as well as save the information for later analysis.  A query may only be in the cache for a short period of time so automatically gathering and storing this information is very beneficial.

Tuesday, February 7, 2012 - 7:24:56 PM - zaim raza Back To Top (15936)

Very good SQL script. The same sort of report is available in the SQL Server performance dashboard reports, so is there any difference between this script and those reports?



Tuesday, February 7, 2012 - 3:04:17 PM - Ben Snaidero Back To Top (15930)

A colleague of mine verified for me that if you are using SQL 2005 Management Studio then you do in fact just get formatted XML for the plan.

Tuesday, February 7, 2012 - 11:31:40 AM - Ben Snaidero Back To Top (15928)

What version of Management Studio are you using?  I am using 2008 and get the plan in graphical format.  Unfortunately I don't have any older versions to test on.

Tuesday, February 7, 2012 - 9:02:52 AM - Jimbo Back To Top (15921)

When I click on the Plan text, I get formatted XML, not an execution plan.
