Improving SQL Server performance when using table variables

By: Matteo Lorini   |   Updated: 2010-10-12   |   Comments (7)   |   Related: Performance Tuning


Problem

Batches or stored procedures that execute join operations on table variables may experience performance problems if the table variable contains a large number of rows.

Solution

Table variables were introduced in SQL Server with the intention of reducing recompiles; however, when they are used in batches or stored procedures they can cause performance problems.

Let's see how this can occur with an example.

First let's create a permanent table and populate it with 100,000 rows.

SET STATISTICS PROFILE OFF
GO
USE tempdb
GO
IF OBJECT_ID ('TestPermTable') IS NOT NULL
   DROP TABLE TestPermTable
GO
CREATE TABLE TestPermTable (a2 VARCHAR(10), c2 INT)
GO
CREATE INDEX index_TestPermTable ON TestPermTable(c2)
GO
--insert 100,000 rows into the permanent table
SET NOCOUNT ON
BEGIN TRAN
DECLARE @i INT
SET @i = 0
WHILE @i < 100000
BEGIN
   INSERT INTO TestPermTable VALUES ('asdfghjkli', @i)
   SET @i = @i + 1
END
COMMIT TRAN
GO
--update statistics on the permanent table
UPDATE STATISTICS TestPermTable WITH FULLSCAN
GO
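
Before moving on, you may want to verify the setup. This quick check is not part of the original tip, but it confirms the row count and shows the statistics that UPDATE STATISTICS just refreshed:

--optional verification: row count and the refreshed index statistics
SELECT COUNT(*) AS TotalRows FROM TestPermTable
GO
DBCC SHOW_STATISTICS ('TestPermTable', 'index_TestPermTable')
GO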

Let's join the permanent table TestPermTable with a table variable containing 100,000 rows.

SET NOCOUNT ON
DECLARE @VariableTable TABLE (c1 INT)
BEGIN TRAN
DECLARE @i INT
SET @i = 0
WHILE @i < 100000
BEGIN
   INSERT INTO @VariableTable VALUES (@i)
   SET @i = @i + 1
END
COMMIT TRAN

SET STATISTICS PROFILE ON
SET STATISTICS TIME ON
SELECT @i = TestPermTable.c2 FROM @VariableTable INNER JOIN TestPermTable ON c1 = c2
GO
SET STATISTICS TIME OFF
SET STATISTICS PROFILE OFF
GO

As we can see below, when the batch started the table variable was empty, so the optimizer estimated a low number of rows. The EstimateRows value for @VariableTable is 1, and therefore the optimizer selected a Nested Loops join.

[Execution plan output: EstimateRows for @VariableTable is 1, and a Nested Loops join is chosen]
[SQL Server execution times for the Nested Loops plan]

An underestimation of rows caused the optimizer to pick an inefficient plan: as we can see, a Nested Loops join was used with the table variable as the outer table. The Nested Loops join uses one join input as the outer input and the other as the inner input. The outer loop consumes the outer input row by row; the inner loop, executed once for each outer row, searches for matching rows in the inner input. A Nested Loops join is particularly effective when the outer input is small and the inner input is large and indexed. For small transactions that touch only a few rows, indexed Nested Loops joins are superior to both Merge joins and Hash joins. For large queries, however, Nested Loops joins are often not the optimal choice.
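
If you want to confirm that the join choice itself is what slows the batch down, you can force each physical join type with a query hint and compare the timings. The batch below is only an illustration and is not part of the original example; it reuses TestPermTable and repopulates the same kind of table variable:

--illustration only: compare a forced Loop join against a forced Hash join
SET NOCOUNT ON
DECLARE @VariableTable TABLE (c1 INT)
DECLARE @i INT
SET @i = 0
WHILE @i < 100000
BEGIN
   INSERT INTO @VariableTable VALUES (@i)
   SET @i = @i + 1
END

SET STATISTICS TIME ON

--the join type the optimizer chose on its own
SELECT @i = TestPermTable.c2
FROM @VariableTable INNER JOIN TestPermTable ON c1 = c2
OPTION (LOOP JOIN)

--the join type better suited to 100,000 outer rows
SELECT @i = TestPermTable.c2
FROM @VariableTable INNER JOIN TestPermTable ON c1 = c2
OPTION (HASH JOIN)

SET STATISTICS TIME OFF
GO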

The solution to our problem is to add the RECOMPILE option to the statement that joins the table variable to the other tables. With this hint, SQL Server recompiles the statement after the table variable has been populated, so it can detect the actual number of rows at that point. This option is only available in SQL Server 2005 and later.

SET NOCOUNT ON
DECLARE @VariableTable TABLE (c1 INT)
BEGIN TRAN
DECLARE @i INT
SET @i = 0
WHILE @i < 100000
BEGIN
   INSERT INTO @VariableTable VALUES (@i)
   SET @i = @i + 1
END
COMMIT TRAN

SET STATISTICS PROFILE ON
SET STATISTICS TIME ON
SELECT @i = TestPermTable.c2 FROM @VariableTable INNER JOIN TestPermTable ON c1 = c2 OPTION (RECOMPILE)
GO
SET STATISTICS TIME OFF
SET STATISTICS PROFILE OFF
GO

As you can see, the optimizer now knows that our table variable contains 100,000 rows and consequently picks a Hash join, which for this example is much faster, as the STATISTICS TIME output below shows.

[Execution plan output: the optimizer estimates 100,000 rows for @VariableTable and chooses a Hash join]
[SQL Server parse and compile time and execution times for the Hash join plan]
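
Since the Problem section mentions stored procedures, here is a minimal sketch of where the hint goes in procedure code. The procedure name and the population logic are hypothetical; the point is that only the statement carrying OPTION (RECOMPILE) is recompiled, after the table variable has been filled:

--hypothetical procedure: statement-level recompile on the table variable join
CREATE PROCEDURE dbo.usp_JoinVariableTable
AS
BEGIN
   SET NOCOUNT ON

   DECLARE @VariableTable TABLE (c1 INT)
   DECLARE @i INT

   --populate the table variable (any population logic goes here)
   INSERT INTO @VariableTable (c1)
   SELECT c2 FROM TestPermTable

   --only this statement is recompiled, after @VariableTable is populated
   SELECT @i = TestPermTable.c2
   FROM @VariableTable
   INNER JOIN TestPermTable ON c1 = c2
   OPTION (RECOMPILE)
END
GO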

About the author
Matteo Lorini is a DBA and has been working in IT since 1993. He specializes in SQL Server and also has knowledge of MySQL.

This author pledges the content of this article is based on professional experience and not AI generated.



Article Last Updated: 2010-10-12

Comments For This Article




Friday, April 12, 2024 - 10:25:10 AM - Mark Freeman Back To Top (92176)
Note that if the statement runs with high frequency, the additional CPU load from all the recompiles can add up to cause a significant additional load. Be sure to test your workload, not just the one statement or stored procedure before deciding OPTION(RECOMPILE) is an appropriate solution. Making one query faster but eating all the CPU in your server may not be a good trade.

Friday, September 18, 2020 - 9:53:58 AM - Elena Alexeeva Back To Top (86502)
OMG thank you so much!!! That easy

Thursday, August 3, 2017 - 7:26:10 AM - R Back To Top (63840)

WOW BIG HELP! I WAS WORRYING SINCE YESTERDAY WHY MY QUERY IS SLOW WHEN USING TableVariables! AND THIS "OPTION(RECOMPILE)" IS THE SOLUTION!! BIG THANKS SIR Matteo Lorini!!!


Monday, July 3, 2017 - 8:48:11 PM - Robert N Back To Top (58807)

 This was exactly the optimizer hint I was looking for! Cut my query execution time from ~3700 ms down to ~350... a 10x improvement!


Monday, May 15, 2017 - 7:36:20 PM - stiles Back To Top (55791)

 Huge improvement, thanks!

 


Tuesday, March 8, 2011 - 11:11:51 AM - Nadeem Khan Back To Top (13145)

yeah it improves alot,thnx


Tuesday, October 12, 2010 - 1:50:10 PM - Mohamed Benothmane Back To Top (10253)
Adding primary key improves the query even better

DECLARE @VariableTable TABLE (c1 INT primary key)

 














