Testing SQL Server Performance of Database Inserts and Updates



Problem

SQL Server offers many different options for inserting and updating data in a table. This tip looks at the performance differences between a few of these options.

Solution

SQL Server Insert Test Setup

In order to test our INSERT statements we will use the T-SQL code below to create a simple heap table. I purposely left out any indexes, keys or constraints so we can measure just the performance of the INSERT itself. Any constraint checks or index updates would have to be done regardless of the insert method, so they would affect each method equally and can be excluded from the comparison.

-- Create TestTable for INSERT test
CREATE TABLE TestTable (col1 int, col2 int, col3 varchar(20), col4 varchar(20));
GO

The different methods of inserting data into this table that we will test are as follows (a minimal sketch of each form is shown after the list):

  • Multiple calls to a regular single row insert
  • Multiple row insert with a single statement
  • Bulk insert
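
The downloadable scripts contain the actual test statements; the snippet below is just a minimal sketch of what each form looks like (the values shown are illustrative, not the ones used in the test files).

-- 1. Individual single row inserts (one statement per record)
INSERT INTO TestTable (col1, col2, col3, col4) VALUES (1, 1, 'test1', 'test1');
INSERT INTO TestTable (col1, col2, col3, col4) VALUES (2, 2, 'test2', 'test2');

-- 2. Multiple row insert with a single statement (up to 1000 rows per VALUES clause)
INSERT INTO TestTable (col1, col2, col3, col4)
VALUES (1, 1, 'test1', 'test1'),
       (2, 2, 'test2', 'test2'),
       (3, 3, 'test3', 'test3');

-- 3. Bulk insert from a native format data file
BULK INSERT dbo.TestTable FROM 'C:\scripts\TestTable10.dat' WITH (DATAFILETYPE='native');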

For each type of statement we will compare inserting 10, 100 and 1000 records. I've created scripts for each of these cases, including data files for the bulk insert, which you can download from the following link. Note that for this test we are not going to insert more than 1000 records, as that is the maximum number of rows allowed for a multiple row insert in a single statement. The last thing we need in order to run the test is a script that calls each of the test cases. This script is shown below and can be run from SSMS, but notice the use of the :r command, which means you will have to run the script in SQLCMD mode for it to execute successfully.

-- NOTE: run this script in SQLCMD mode (Query > SQLCMD Mode in SSMS) because of the :r commands below
truncate table TestTable
GO
BULK INSERT dbo.TestTable FROM 'C:\scripts\TestTable10.dat' WITH (DATAFILETYPE='native')
GO
truncate table TestTable
GO
BULK INSERT dbo.TestTable FROM 'C:\scripts\TestTable100.dat' WITH (DATAFILETYPE='native')
GO
truncate table TestTable
GO
BULK INSERT dbo.TestTable FROM 'C:\scripts\TestTable1000.dat' WITH (DATAFILETYPE='native')
GO

truncate table TestTable
GO
:r ".\scripts\TestTableIndividual10.sql"
GO
truncate table TestTable
GO
:r ".\scripts\TestTableIndividual100.sql"
GO
truncate table TestTable
GO
:r ".\scripts\TestTableIndividual1000.sql"
GO

truncate table TestTable
GO
:r ".\scripts\TestTableMultiple10.sql"
GO
truncate table TestTable
GO
:r ".\scripts\TestTableMultiple100.sql"
GO
truncate table TestTable
GO
:r ".\scripts\TestTableMultiple1000.sql"
GO
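
The timing numbers in the next section were captured with a SQL Profiler trace. If you would rather not set up a trace, one rough alternative (just a sketch, not how the published numbers were collected) is to turn on the time and I/O statistics around a test, for example:

-- Rough alternative to a Profiler trace: per-statement CPU, elapsed time and reads
-- are printed to the Messages tab (note this produces one message per statement,
-- so the individual insert scripts generate a lot of output)
SET STATISTICS TIME ON;
SET STATISTICS IO ON;
GO
:r ".\scripts\TestTableMultiple1000.sql"
GO
SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
GO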

SQL Server Insert Test Results

Once the script has been run and the performance statistics have been collected using SQL Profiler, we can take a look at the results below.

Query                 # of Records   CPU (ms)   Reads   Writes   Duration (ms)
Individual Insert               10          0      23        1              33
Individual Insert              100         47     113        0             158
Individual Insert             1000        171    1057        4             335
Multiple Row Insert             10          0      23        0               1
Multiple Row Insert            100          0     113        0               3
Multiple Row Insert           1000         16    1105        0               6
Bulk Insert                     10          0      27        2               5
Bulk Insert                    100          0     118        1              13
Bulk Insert                   1000          0    1145        8              20

As you would expect, the individual insert statements performed the worst by quite a large margin when looking at the total duration, and they used marginally more CPU than the other two options. This can probably be attributed to the overhead of a separate call for each insert. I didn't, however, expect the multiple row insert to perform so much better than the bulk insert, almost 4 times faster in each case. The only drawback is that this type of statement only works for up to 1000 records, so it can't be used for larger data loads without breaking them up into 1000-record batches.

SQL Server Update Test Setup

Next let's move on to an example using UPDATE statements. For this test we will create a test table with a primary key (to be used by the update statements) and load some sample data into it. The T-SQL code to perform these steps is below.

-- Create TestUpdTable for UPDATE test
CREATE TABLE [dbo].[TestUpdTable](
    [pk] [int] IDENTITY(1,1) PRIMARY KEY,
	[col1] [int] NULL,
	[col2] [int] NULL,
	[col3] [int] NULL,
	[col4] [int] NULL,
	[col5] [int] NULL,
	[col6] [datetime] NULL,
	[col7] [datetime] NULL,
	[col8] [datetime] NULL,
	[col9] [datetime] NULL,
	[col10] [datetime] NULL,
	[col11] [varchar](20) NULL,
	[col12] [varchar](20) NULL,
	[col13] [varchar](20) NULL,
	[col14] [varchar](20) NULL,
	[col15] [varchar](20) NULL
) ON [PRIMARY]
GO

-- Load sample data (rows 1 through 99,999)
DECLARE @val INT
SELECT @val=1
WHILE @val < 100000
BEGIN
  INSERT INTO [dbo].[TestUpdTable]
     ([col1],[col2],[col3],[col4],[col5],
      [col6],[col7],[col8],[col9],[col10],
      [col11],[col12],[col13],[col14],[col15])
    VALUES
     (@val,@val,@val,@val,@val,
      getdate(),getdate(),getdate(),getdate(),getdate(),
      'test' + cast(@val as varchar),'test' + cast(@val as varchar),
      'test' + cast(@val as varchar),'test' + cast(@val as varchar),
      'test' + cast(@val as varchar))
  SELECT @val=@val+1;
END;

For this test we are going to update 9 columns in the table, once using an individual statement for each column and once using a single statement that updates all of them. I am sure if you have been a DBA for long enough you have come across an application that performs updates like this (i.e., an individual update for each field edited on a form). I think it's safe to say which approach will perform better, but let's see how much better. Below is the T-SQL code for each of these updates.

UPDATE TestUpdTable SET col1='23' WHERE pk=10
UPDATE TestUpdTable SET col2='25' WHERE pk=10
UPDATE TestUpdTable SET col3='27' WHERE pk=10
UPDATE TestUpdTable SET col6=getdate()-1 WHERE pk=10
UPDATE TestUpdTable SET col7=getdate()-1 WHERE pk=10
UPDATE TestUpdTable SET col8=getdate()-1 WHERE pk=10
UPDATE TestUpdTable SET col11='test23' WHERE pk=10
UPDATE TestUpdTable SET col12='test25' WHERE pk=10
UPDATE TestUpdTable SET col13='test27' WHERE pk=10
GO

UPDATE TestUpdTable
   SET col1='23',col2='25',col3='27',
       col6=getdate()-1,col7=getdate()-1,col8=getdate()-1,
       col11='test23',col12='test25',col13='test27'
 WHERE pk=15
GO

SQL Server Update Test Results

Looking at the SQL Profiler statistics below, it's pretty obvious how much better the performance is when the update is performed with a single statement. The single update statement performed fewer reads and completed much faster than the individual update statements. I think this result makes a good case for using some dynamic SQL within your application so that as many column updates as possible can be merged into a single statement (a minimal sketch of this approach follows the results table).

Query               CPU (ms)   Reads   Writes   Duration (ms)
Individual Column          0      27        0             28
Multiple Column            0       3        0              1
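
Here is a minimal sketch of the dynamic SQL idea mentioned above. The @newCol parameters and the NULL-means-unchanged convention are assumptions for illustration only; a real application would typically build the statement client-side, but the principle of merging all edited columns into one parameterized UPDATE is the same.

-- Sketch only: combine whichever columns were edited into a single UPDATE.
-- @newCol1 / @newCol11 are hypothetical inputs; NULL here means "not edited".
DECLARE @pk int = 15,
        @newCol1 int = 23,
        @newCol11 varchar(20) = 'test23';

DECLARE @set nvarchar(max) = N'';
IF @newCol1  IS NOT NULL SET @set += N'col1 = @c1, ';
IF @newCol11 IS NOT NULL SET @set += N'col11 = @c11, ';

IF LEN(@set) > 0
BEGIN
    -- strip the trailing comma and run one parameterized UPDATE
    DECLARE @sql nvarchar(max) =
        N'UPDATE dbo.TestUpdTable SET ' + LEFT(@set, LEN(@set) - 1) + N' WHERE pk = @pk;';
    EXEC sp_executesql @sql,
         N'@c1 int, @c11 varchar(20), @pk int',
         @c1 = @newCol1, @c11 = @newCol11, @pk = @pk;
END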


About the author
Ben Snaidero has been a SQL Server and Oracle DBA for over 10 years and focuses on performance tuning.





Comments For This Article




Thursday, March 24, 2016 - 5:46:11 PM - Ben Snaidero

Hi Tony,

The version should not really matter for a test like this but my test was run on SQL 2016 CTP3.  I actually ran about 10 tests of each script and threw out the highs and lows and then took an average so as to get rid of any anomalies.  You may want to try doing the same.  I am just running on a small desktop so I don't want any background service that kicks in to skew my test results.

Thanks for reading.

Ben.


Thursday, March 24, 2016 - 12:44:31 PM - Tony Green

Hi,

I tried your test on a SQL Server 2012 SP 3 and got significantly worse results for the TestTableMultiple1000.sql script than individual inserts.  775ms for multiple vs 126ms for individual inserts.  The 10 and 100 row insert performance was as expected.

Which SQL Server version did you test with?

Thanks,

Tony Green














