SQL Server CTE vs Temp Table vs Table Variable Performance Test

By:   |   Updated: 2022-03-01   |   Comments (12)   |   Related: More > TSQL


Problem

In a previous article, SQL Server Temp Table vs Table Variable Performance Testing, we looked at SQL Server performance differences between using a temp table and a table variable for different DML operations. One of the comments suggested comparing these results to using a Common Table Expression (CTE) for similar operations. In this article we will go through a few simple SELECT queries to compare the performance in SQL Server of a temporary table, a table variable, and a CTE. We will compare the execution times of each query as well as the system resources needed to complete each operation.

Solution

While a CTE is a really good tool, it does have some limitations compared with a temporary table or a table variable. The biggest difference is that a CTE can only be used in the current query scope, whereas a temporary table or table variable exists for the duration of the session, allowing you to perform many different DML operations against it. That said, the CTE is still a very useful tool: it can make your T-SQL code more readable, and it makes writing recursive queries much less complex because a CTE can reference itself.
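To illustrate that last point, here is a minimal recursive CTE (a sketch for illustration only, not part of the tests below) that generates the numbers 1 through 10 by referencing itself:

-- Recursive CTE example: counts from 1 to 10
WITH numbers (n) AS
 (SELECT 1            -- anchor member
  UNION ALL
  SELECT n + 1        -- recursive member references the CTE itself
  FROM numbers
  WHERE n < 10)
SELECT n FROM numbers;
GO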

Compare Temp Table, Table Variable and CTE

For this test we will go through an example using the AdventureWorks2014 database and gather all the order header data for each customer's most recent order.

Below is the T-SQL for each of our test query types.

-- CTE
WITH t (customerid, lastorderdate) AS 
 (SELECT customerid, max(orderdate) 
  FROM sales.SalesOrderHeader
  GROUP BY customerid)
SELECT * 
FROM sales.salesorderheader soh
INNER JOIN t ON soh.customerid=t.customerid AND soh.orderdate=t.lastorderdate
GO

-- Temporary table
CREATE TABLE #temptable (customerid [int] NOT NULL PRIMARY KEY, lastorderdate [datetime] NULL);

INSERT INTO #temptable
SELECT customerid, max(orderdate) as lastorderdate 
FROM sales.SalesOrderHeader
GROUP BY customerid;

SELECT * 
FROM sales.salesorderheader soh
INNER JOIN #temptable t ON soh.customerid=t.customerid AND soh.orderdate=t.lastorderdate

DROP TABLE #temptable
GO

-- Table variable
DECLARE @tablevariable TABLE (customerid [int] NOT NULL PRIMARY KEY, lastorderdate [datetime] NULL);

INSERT INTO @tablevariable
SELECT customerid, max(orderdate) as lastorderdate 
FROM sales.SalesOrderHeader
GROUP BY customerid;

SELECT * 
FROM sales.salesorderheader soh
INNER JOIN @tablevariable t ON soh.customerid=t.customerid AND soh.orderdate=t.lastorderdate
GO

Looking at SQL Profiler results from these queries (each was run 10 times and the averages are below) we can see that the CTE just slightly outperforms both the temporary table and table variable queries when it comes to overall duration. The CTE also uses less CPU than the other two options and performs fewer reads (significantly fewer than the table variable query).

Query Type       Reads     Writes   CPU   Duration (ms)
CTE              1,378     0        47    497
Temp table       2,146     51       109   544
Table variable   133,748   51       297   578
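If you do not have SQL Profiler available, a rough per-session approximation of these numbers can be captured with SET STATISTICS (the output format differs from Profiler, and your figures will vary by environment):

-- Capture reads, CPU and duration for a single run
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

-- run one of the test queries here

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
GO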

Another Example Comparing Performance of Temp Table, Table Variable and CTE

Let's also test getting the most recent order header data for just a single customer to see if there is any difference in performance for a smaller data set.

Here are the updated queries which add a WHERE clause to each statement we tested above.

-- CTE
WITH t (customerid, lastorderdate) AS 
 (SELECT customerid, max(orderdate) 
  FROM sales.SalesOrderHeader
  WHERE customerid=27604 
  GROUP BY customerid)
SELECT * 
FROM sales.salesorderheader soh
INNER JOIN t ON soh.customerid=t.customerid AND soh.orderdate=t.lastorderdate
GO

--Temp table
CREATE TABLE #temptable (customerid [int] NOT NULL PRIMARY KEY, lastorderdate [datetime] NULL);

INSERT INTO #temptable
SELECT customerid, max(orderdate) as lastorderdate 
FROM sales.SalesOrderHeader
WHERE customerid=27604
GROUP BY customerid;

SELECT * 
FROM sales.salesorderheader soh
INNER JOIN #temptable t ON soh.customerid=t.customerid AND soh.orderdate=t.lastorderdate

DROP TABLE #temptable
GO

--Table variable
DECLARE @tablevariable TABLE (customerid [int] NOT NULL PRIMARY KEY, lastorderdate [datetime] NULL);

INSERT INTO @tablevariable
SELECT customerid, max(orderdate) as lastorderdate 
FROM sales.SalesOrderHeader
WHERE customerid=27604
GROUP BY customerid;

SELECT * 
FROM sales.salesorderheader soh
INNER JOIN @tablevariable t ON soh.customerid=t.customerid AND soh.orderdate=t.lastorderdate
GO

Looking at these SQL Profiler results (averaged over 10 executions) we again see that the CTE gives us the best overall performance in terms of duration and resources used. The surprising thing in this case, though, is that the table variable outperformed the temporary table in all categories.

Query Type       Reads   Writes   CPU   Duration (ms)
CTE              10      0        0     1
Temp table       318     1        15    12
Table variable   91      1        0     1
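One variation worth testing here (a sketch, not something measured in this tip): table variables have no statistics, so the optimizer often assumes a very low row count. Adding OPTION (RECOMPILE) to the join lets the optimizer see the table variable's actual row count at execution time (SQL Server 2019's deferred compilation for table variables does this automatically), which may change these results:

SELECT * 
FROM sales.salesorderheader soh
INNER JOIN @tablevariable t ON soh.customerid=t.customerid AND soh.orderdate=t.lastorderdate
OPTION (RECOMPILE);
GO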

As mentioned at the start of the article, a CTE is not quite as versatile as temporary tables and table variables, but we've shown here that for individual queries that require temporary/staged data it can be a good option to improve performance. When dealing with query performance, though, always test with your own queries and data: there are many variables that can affect performance, and a different use case could lead to another option performing better.
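When running your own comparisons, one approach (on a non-production test system only) is to clear the buffer pool and plan cache between runs so each query starts from the same cold state:

-- WARNING: test systems only, never production
CHECKPOINT;               -- flush dirty pages so DROPCLEANBUFFERS clears everything
DBCC DROPCLEANBUFFERS;    -- empty the buffer pool
DBCC FREEPROCCACHE;       -- clear cached execution plans
GO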

About the author
MSSQLTips author Ben Snaidero has been a SQL Server and Oracle DBA for over 10 years and focuses on performance tuning.




Comments For This Article




Wednesday, June 1, 2022 - 4:55:54 PM - Marcel
The drop table should be removed for the performance testing. Drop table will happen when SQL Server has time, which means that statement can be delayed, tilting the performance timings. I used to have the drop table at the end of stored procedures, but removed it as it is not required and performance increased. Would be interesting to see the performance without the drop table.

Monday, February 28, 2022 - 5:21:55 PM - Ben Snaidero
As a few people have mentioned before, there are a lot of things that can affect the performance of your particular use case, which is why I mention at the end of the tip that you need to test to find out what works best for your environment.
While I agree with @Codeman and @Scott S that you could just select directly into the table (creating the index after the load, which is a good best practice) and it would be faster, that is not always the case. This simple example shows it's better to create it first.

SELECT customerid,max(orderdate) lastorderdate INTO #temptable
FROM sales.SalesOrderHeader
GROUP BY customerid;
CREATE INDEX temptable_pk on #temptable (customerid);
drop table #temptable
GO

--cpu 30ms, reads 1511, writes 81, duration 31ms

CREATE TABLE #temptable (customerid [int] NOT NULL PRIMARY KEY,lastorderdate [datetime] NULL);
INSERT INTO #temptable
SELECT customerid,max(orderdate) lastorderdate FROM sales.SalesOrderHeader
GROUP BY customerid;
drop table #temptable
GO

--cpu 16ms, reads 1215, writes 33, duration 26ms


The main point of the tips is to give people options and test what works/performs best for their particular use case.

Monday, February 7, 2022 - 6:23:16 PM - Scott S
Codeman has the optimal solution. SELECT INTO is much, much quicker. And when you're dealing with 100's of 1000's of rows you can measure that difference in hours.

Monday, November 22, 2021 - 12:21:38 PM - Codeman
No need to create the temp table and then insert into it. Just do a SELECT INTO instead.

SELECT customerid,max(orderdate) lastorderdate
INTO #temptable
FROM sales.SalesOrderHeader
GROUP BY customerid;

Thursday, October 28, 2021 - 10:45:36 AM - Daniel Johnson
CTEs and table variables do not scale. If you can guarantee that you are dealing with small data sets, go crazy. If not, go with temp tables. There are a million reasons why this works well for larger data sets.

Saturday, May 9, 2020 - 12:53:05 AM - Alexander Kirstein

Nice brief article enticing to dig deeper. Like why would the CTE duration be marginally better when all resources consumed are exponentially better? Also, the differences in the i/o subsystem for each, buffer memory, tempdb database on disk, indexing options. These also weigh significantly on which to use when all are viable. Good article, thank you.


Wednesday, February 27, 2019 - 9:53:00 AM - abc

It's a good article; it clearly shows all the differences between table variables, temp tables and CTEs.


Thursday, January 10, 2019 - 2:54:54 PM - Michael S Barash

Ben, I appreciate this article. I have been able to greatly reduce the number of reads on declared tables by putting an index on them. Most of the extra reads were due to a default table scan on my declared tables. If we can greatly reduce the number of reads on table variables with indexes, then this makes table variables perform even better.

Thanks, Michael Barash - DBA/Senior-SQL-Developer.


Tuesday, September 4, 2018 - 11:14:22 AM - R Evans

The speed will depend on the data.  (I haven't got AdventureWorks 2014 to hand to say how many rows would be involved in this example.)

Table variables don't have statistics so the cardinality estimator assumes there is only one row - so these work best with smaller datasets.


Tuesday, September 4, 2018 - 10:36:37 AM - Adrian Hernandez

 Hello,

Based on your experience, would you say that a CTE is always faster than using a table variable or a temp table?


Tuesday, September 4, 2018 - 9:47:01 AM - Ben Snaidero

@R Evans

Good idea.  Actually just needed a small update to your TSQL and it executed using only 5 reads (as opposed to 10 in my example).  Updated TSQL is below.

WITH t ([SalesOrderID],[RevisionNumber],[OrderDate],[DueDate],[ShipDate],[Status],[OnlineOrderFlag],[SalesOrderNumber]
      ,[PurchaseOrderNumber],[AccountNumber],[CustomerID],[SalesPersonID],[TerritoryID],[BillToAddressID],[ShipToAddressID]
      ,[ShipMethodID],[CreditCardID],[CreditCardApprovalCode],[CurrencyRateID],[SubTotal],[TaxAmt],[Freight]
      ,[TotalDue],[Comment],[rowguid],[ModifiedDate],rowno)
AS (
SELECT *, row_number() over (partition by customerid order by orderdate desc) rowno
FROM sales.SalesOrderHeader
WHERE customerid=27604
) SELECT * from t
where rowno = 1


Tuesday, September 4, 2018 - 3:54:21 AM - R Evans

With the CTE there's no need to join back to the cte - you can use the row_number() window function; would be interesting to compare the performance again.

WITH t (customerid,lastorderdate)
AS (
SELECT *, row_number() over (partition by customerid order by orderdate desc) rowno
FROM sales.SalesOrderHeader
WHERE customerid=27604
GROUP BY customerid
) SELECT * from t
where rowno = 1


