Options to Improve SQL Server Backup Performance



Problem

Database systems seem to grow almost exponentially every year, and as DBAs one of our primary responsibilities is to make sure this data is backed up regularly. In this tip we will look at options in SQL Server 2008 that can decrease the duration of your backups and/or reduce the overhead this process puts on your system.

Solution

Note: All tests were done on a SQL Server 2008 database with a single .mdf (16GB) and server specification as follows: 6GB RAM, 2x2.83GHz CPU, Single 1TB SCSI disk device. Test scripts were run 3 consecutive times and an average taken for each data value. Tests should be run on your own environment as results may vary based on server hardware/configuration.

For this performance test we are going to look at the following database backup parameters/options and their effect on CPU usage and backup duration. Details on each of these options/parameters can be found in the BACKUP (Transact-SQL) documentation.

  1. multiple backup devices
  2. parameter - maxtransfersize
  3. parameter - buffercount
  4. parameter - compression

Before we get into the results of the test I want to outline a few things I came across during my testing.

First, the extra memory used during the backup process can be calculated from the options selected above and is basically "maxtransfersize x buffercount + some small overhead". Unless you are using really large values for buffercount (the maximum maxtransfersize is 4MB), this results in only a small amount of extra memory for the backup process, so I decided not to track it during this test.

The default value of the buffercount parameter can also be calculated, using the formula "NumberofBackupDevices * 3 + NumberofBackupDevices + (2 * NumberofVolumesInvolved)". In my case this meant a buffercount of 7, 12, 17, 22 and 27 for 1, 2, 3, 4 and 5 disk devices respectively. You can also check these values for any backup command you run by checking the SQL error log after enabling trace flags 3605 and 3213.
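As a minimal sketch of how to check those values (assuming a database named TestDB; the error-log filter string is illustrative, as the exact wording of the logged message can vary by build), the trace-flag approach looks like this:

```sql
-- Enable trace flags so backup internals are written to the error log:
-- 3605 routes trace output to the log, 3213 logs backup/restore
-- buffer configuration (BufferCount, MaxTransferSize, etc.).
DBCC TRACEON (3605, -1);
DBCC TRACEON (3213, -1);

-- Run a backup; the error log will now record the buffer settings
-- the backup actually used (TestDB is a hypothetical database name).
BACKUP DATABASE [TestDB] TO DISK = N'nul';

-- Read the current error log back; the third parameter filters
-- entries containing the given string.
EXEC sys.xp_readerrorlog 0, 1, N'Buffer';

DBCC TRACEOFF (3605, 3213, -1);
```

This is handy for confirming the calculated default buffercount before you start overriding it explicitly.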



Testing backup to nul device without compression

For our first test we will test the first three options above, using the 'nul' device as the backup location to eliminate any I/O or network bottlenecks. I normally run all my backups to a CIFS share across the network, so network latency and/or I/O can sometimes be an issue. The script used for this test can be found here and below are the results. These first two charts show the results of varying only the maxtransfersize parameter. We can see that as we increase maxtransfersize the duration of the backup decreases, while CPU usage increases, although not by much relatively speaking. From these results we get the best performance when using a 4MB maxtransfersize and 5 disk files for the backup.
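A minimal sketch of the kind of statements this test runs (TestDB is a hypothetical database name; the full test script linked above adds timing and iterates over the parameter values):

```sql
-- Baseline: single 'nul' device, all defaults.
BACKUP DATABASE [TestDB] TO DISK = N'nul';

-- Vary maxtransfersize; the value is in bytes and 4194304 (4MB)
-- is the maximum allowed.
BACKUP DATABASE [TestDB] TO DISK = N'nul'
    WITH MAXTRANSFERSIZE = 4194304;

-- Stripe the backup across multiple devices (all 'nul' here;
-- with real backups each DISK clause would be a separate file path).
BACKUP DATABASE [TestDB]
    TO DISK = N'nul', DISK = N'nul', DISK = N'nul',
       DISK = N'nul', DISK = N'nul'
    WITH MAXTRANSFERSIZE = 4194304;
```

Writing to 'nul' discards the output, so only the read side and the backup machinery itself are being measured.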


Nul device - maxtransfersize - CPU
Nul device - maxtransfersize - duration

In the next 4 charts we will show the effects of varying the buffercount parameter and see how much, if any, we can improve performance. The results below are from the tests that used a 1MB maxtransfersize (first 2 charts) and a 4MB maxtransfersize (second 2 charts), as these two values yielded the best performance in the results above.


Nul device - buffercount - 1MB maxtransfersize - CPU
Nul device - buffercount - 1MB maxtransfersize - duration

Nul device - buffercount - 4MB maxtransfersize - CPU
Nul device - buffercount - 4MB maxtransfersize - duration

From this test we can see that the largest buffercount tested, 47, yields the best performance regardless of the maxtransfersize parameter. One thing to note is that with the smaller maxtransfersize we use less CPU to complete the process. Another thing to note is that in this case the number of disk files used for the backup seemed to be irrelevant. This can probably be attributed to the fact that in the background SQL Server increases the default buffercount based on the number of disk files you specify.
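For reference, overriding buffercount explicitly looks like this (a sketch with the same hypothetical TestDB; 47 is simply the best-performing value from the charts above):

```sql
-- Fix maxtransfersize at 1MB and override the default buffercount.
BACKUP DATABASE [TestDB] TO DISK = N'nul'
    WITH MAXTRANSFERSIZE = 1048576, BUFFERCOUNT = 47;

-- Same override at the 4MB maxtransfersize.
BACKUP DATABASE [TestDB] TO DISK = N'nul'
    WITH MAXTRANSFERSIZE = 4194304, BUFFERCOUNT = 47;
```

Remember that the memory consumed is roughly maxtransfersize x buffercount, so a 4MB transfer size with 47 buffers reserves on the order of 188MB for the duration of the backup.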


Testing backup to physical file device without compression

For our second test we will perform the exact same test scenario as above with the only difference being that we will write the backup to a physical file instead of to the 'nul' device. The script used for this test can be found here and below are the results.
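A sketch of the physical-file variant (the database name and backup path are hypothetical; only the destination changes from the 'nul' tests):

```sql
-- Same options as before, but writing to a real backup file.
-- INIT overwrites any existing backup sets in the file, and
-- STATS = 10 reports progress every 10 percent.
BACKUP DATABASE [TestDB]
    TO DISK = N'D:\Backup\TestDB.bak'
    WITH MAXTRANSFERSIZE = 4194304, INIT, STATS = 10;
```

With a physical destination, write throughput to the backup target now enters the picture, which is why these results differ from the 'nul' device tests.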


Physical device - maxtransfersize - CPU
Physical device - maxtransfersize - duration

Physical device - buffercount - 1MB maxtransfersize - CPU
Physical device - buffercount - 1MB maxtransfersize - duration

Physical device - buffercount - 4MB maxtransfersize - CPU
Physical device - buffercount - 4MB maxtransfersize - duration

We can see from the results of this test that we get the best performance by using a large value for maxtransfersize (1MB or greater) and the default value for buffercount (which will be based on the number of disk files used for the backup). With these settings we complete our full backup of 16GB in about 98 seconds. Under the default scenario, running a backup to a single disk file with defaults for the other parameters takes about 138 seconds, so that's quite an improvement. Also interesting to note is that when we use a physical file and vary the buffercount, we don't see the nice tight CPU pattern we see with the other tests.


Testing backup to physical file device with compression

This final test uses the same scenario as the previous two tests, only this time we will add the 4th option listed above, compression, to all of the backup statements. The script used for this test can be found here and below are the results.
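Adding compression is a one-keyword change to the statements above (again a sketch; database name, path, and the parameter values shown are just the ones that performed well in these tests):

```sql
-- Compressed backup with a large transfer size and buffer count.
BACKUP DATABASE [TestDB]
    TO DISK = N'D:\Backup\TestDB_compressed.bak'
    WITH COMPRESSION, MAXTRANSFERSIZE = 4194304,
         BUFFERCOUNT = 27, INIT, STATS = 10;
```

Note that in SQL Server 2008, backup compression is an Enterprise Edition feature, so check your edition before relying on it.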


Physical device - maxtransfersize - compression - CPU
Physical device - maxtransfersize - compression - duration

Physical device - buffercount - 1MB maxtransfersize - compression - CPU
Physical device - buffercount - 1MB maxtransfersize - compression - duration

Physical device - buffercount - 4MB maxtransfersize - compression - CPU
Physical device - buffercount - 4MB maxtransfersize - compression - duration

We can see from these results that our biggest improvement comes from adding compression. With a large value for maxtransfersize (1MB or greater) and a large value for buffercount (27 or greater) we can get this backup to run in under 60 seconds, more than twice as fast as using a single disk file with the default parameters.

The only drawback to using compression, which comes out in the graphs, is the amount of CPU needed to perform compressed backups. Our initial tests without compression used about 400-1000ms of CPU during a backup, whereas with compression we use anywhere from about 1000-3000ms (considering only the CPU usage for the faster backup durations). Since this extra CPU usage is spread over the full duration of the backup, I would not be concerned; an extra 2000ms of CPU over a 60 second backup is really not that significant.

Another benefit of compression not shown in the charts is the size of the backup. Without compression our backup file was about 16GB; with compression this file was only 5GB. One thing to be mindful of is that the data/datatypes used in your database will determine just how much disk space savings you will see when adding this option.
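You can verify the space savings for your own backups from the backup history in msdb (a sketch; substitute your database name for the hypothetical TestDB):

```sql
-- Compare the logical backup size to the compressed on-disk size
-- for recent backups of a database.
SELECT  database_name,
        backup_finish_date,
        backup_size / 1048576.0            AS backup_size_mb,
        compressed_backup_size / 1048576.0 AS compressed_size_mb
FROM    msdb.dbo.backupset
WHERE   database_name = N'TestDB'
ORDER BY backup_finish_date DESC;
```

For an uncompressed backup the two size columns will be equal, so the ratio between them gives your effective compression ratio.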

Summary

While this test case showed that we were able to decrease our backup duration from 138 seconds to 54 seconds, and the backup file size from 16GB to 5GB, with the only drawback being an increase in CPU usage from at most 1000ms to 3000ms, testing should be done in your own environment to find the optimal settings for your backups.


About the author
Ben Snaidero has been a SQL Server and Oracle DBA for over 10 years and focuses on performance tuning.

This author pledges the content of this article is based on professional experience and not AI generated.




Comments For This Article




Sunday, February 12, 2023 - 3:30:08 PM - Jeff Moden
I had reason to revisit this article so that I could copy the URL to give to someone else. With that, I'll confirm again that the original settings I used way back circa 2011 have never needed a change. Thanks again for this well written and proof-laden article, Ben.

Thursday, March 31, 2016 - 9:23:17 PM - Jeff Moden

I know I'm a half decade late but wanted to say "Thanks" for this great article, Ben.  I'm happy to report that your findings still work! ;-)

My only warning to readers would be that if you can't guarantee physical spindles, don't split the backup to multiple files because the head chatter on a single spindle will actually make things take longer.  Keep it to 1 file per spindle.


Wednesday, June 26, 2013 - 2:59:05 PM - Fernando Dondeo

Great tip. It helped me a lot before acquiring new devices for the company. Using the 'nul' device was perfect for finding out how fast my SQL Server is.

Thanks.


Friday, November 18, 2011 - 8:54:50 AM - Jeremy Kadlec

Ben,

This is a great tip.  I think you did a great job covering it from all angles.

Thank you,
Jeremy Kadlec


Thursday, November 17, 2011 - 10:26:42 AM - Steve S

Ben, Very nice article!  I can make immediate use of this.  I had forgotten how useful the 'nul' device can be for comparative purposes.  Have a great day!


Thursday, November 17, 2011 - 8:10:14 AM - Ben Snaidero

Hi Wahid,

The only additional tool I used for testing was SQL Profiler, to gather the CPU and duration. If I had included memory usage (although I stated in the tip how to calculate this) I would have had to use performance counters. As for plotting the results, I used Microsoft Excel. I found that at first it's not the most intuitive tool to use, but once you get used to it, it's pretty good.

Thanks for reading.

Ben.


Thursday, November 17, 2011 - 12:29:31 AM - Wahid

What additional tools did you use for testing, or more specifically for plotting your test results? The graphs look nice!














