By: Richard Vantrease
Problem
I have a large database on a SQL Server instance with more than 70% free space due to data cleanup. Unfortunately the SQL Server instance is running out of disk space and I don't have any free drive bays left to add space. The database is not expected to ever need the free space it has available because data retention policies have changed. Therefore, in this case it makes sense to shrink the database files and reclaim the space.
I am hoping to recover more than 200 GB of free space from the database I will shrink. I have tested shrinking the database by 100 MB, and it takes roughly 10 minutes. This will clearly take quite some time, so I don't want to just tell SQL Server to shrink the database by 200 GB and walk away. What is a good way to shrink a database by roughly 200 GB?
I should note that normally it is a VERY BAD idea to shrink a database. Usually you would be much better off just adding more storage than shrinking an existing database. This article explains why not to shrink a database: Why you should not shrink your data files. Essentially the problem is that when SQL Server tries to shrink a database, it first clears space at the end of the file. It does this by moving the pages closest to the end of the file into the free space closest to the beginning of the file, and repeats this until it has cleared enough space to truncate the file, thereby shrinking it. This can cause near perfect fragmentation, as Paul Randal points out. Here is also a tip from MSSQLTips.com author Andy Novick on Issues with running DBCC SHRINKFILE on your SQL Server data files.
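Because the fragmentation left behind is the main concern, it is worth measuring it after any shrink. Here is a rough sketch of one way to do that from PowerShell (this assumes the Invoke-Sqlcmd cmdlet from the SqlServer module is available; the instance and database names are the examples used later in this tip):

```powershell
# Report average fragmentation per index after a shrink, worst first.
# Assumes Invoke-Sqlcmd (SqlServer module) and the example names from this tip.
Invoke-Sqlcmd -ServerInstance 'V2012' -Database 'MyBigDB' -Query @"
SELECT OBJECT_NAME(ips.object_id) AS TableName,
       i.name                     AS IndexName,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id
 AND i.index_id  = ips.index_id
ORDER BY ips.avg_fragmentation_in_percent DESC;
"@
```

Heavily fragmented indexes can then be rebuilt or reorganized once the shrink is complete.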
Solution
Let's start by getting size and free space information for our database. In my previous tip Check SQL Server Database Consistency and Backups Without Impacting Performance I demonstrated how to retrieve a database object using PowerShell and Microsoft SQL Server Management Objects (SMO). Arshad Ali talks about SMO in his tip: Getting started with SQL Server Management Objects (SMO). We can use the SMO database object to get size and free space information for the database.
For my example, I have created a sample database called MyBigDB that is 50 GB with roughly 153 MB used.
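If you want to follow along, a comparable sample database can be created with a script along these lines (a sketch only; the file paths and sizes are assumptions chosen to match the example, and Invoke-Sqlcmd from the SqlServer module is assumed to be available):

```powershell
# Create a 50 GB sample database; adjust paths and sizes for your environment.
Invoke-Sqlcmd -ServerInstance 'V2012' -Query @"
CREATE DATABASE MyBigDB
ON PRIMARY ( NAME = N'MyBigDB',     FILENAME = N'D:\SQLData\MyBigDB.mdf',     SIZE = 50GB )
LOG ON     ( NAME = N'MyBigDB_log', FILENAME = N'D:\SQLData\MyBigDB_log.ldf', SIZE = 1GB  );
"@
```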
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")
$Server = New-Object Microsoft.SqlServer.Management.Smo.Server 'V2012'
$Database = $Server.Databases['MyBigDB']
$Database.Size
Unfortunately this is not the size property I am looking for. What I really need is the size and free space information for a file. To get to that I first need to access the filegroups for the database; in this case there is only one filegroup, the PRIMARY filegroup. Luckily for us, the SMO database object has a FileGroups property, which is a collection of FileGroup objects. The FileGroup object has a Files property, which is a collection of DataFile objects. In our case we need the PRIMARY filegroup, and we need the only file in that filegroup, which has a logical name of MyBigDB.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")
$Server = New-Object Microsoft.SqlServer.Management.Smo.Server 'V2012'
$Database = $Server.Databases['MyBigDB']
$Filegroup = $Database.FileGroups['Primary']
$File = $Filegroup.Files['MyBigDB']
Now that I have the data file, I can get the size and free space information I need. A DataFile object has three size properties:
- AvailableSpace - Gets the amount of available space in the data file in KB.
- Size - Gets or sets the current size of the data file in KB.
- UsedSpace - Gets the amount of used space in the data file in KB.
The properties AvailableSpace, Size and UsedSpace are in KB. To convert them to MB, divide them by 1024. Here is what I have so far:
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")
$Server = New-Object Microsoft.SqlServer.Management.Smo.Server 'V2012'
$Database = $Server.Databases['MyBigDB']
$Filegroup = $Database.FileGroups['Primary']
$File = $Filegroup.Files['MyBigDB']
'Datafile size properties'
$File.AvailableSpace/1024
$File.Size/1024
$File.UsedSpace/1024

GAC    Version       Location
---    -------       --------
True   v2.0.50727    C:\Windows\assembly\GAC_MSIL\Microsoft.SQLServer.Smo\11.0.0...
Datafile size properties
49846.3125
50000
153.6875
My sample database (MyBigDB) is 50 GB with just over 49.8 GB free. MyBigDB only has one data file, MyBigDB (not really a good idea, but it makes my tip much simpler). I could just run a single DBCC SHRINKFILE command to shrink the file to 160 MB.
DBCC SHRINKFILE ('MyBigDB', 160)
Unfortunately, this would mean waiting while all of the free space is cleared before the file is shrunk. I don't like to do this because it means you don't recover any free space until all of it is available. In some cases things are getting pretty urgent, and waiting a long time would not be a good thing. What I prefer to do instead is shrink a small amount at a time, so that I get back each small chunk as soon as it becomes available. The following DBCC SHRINKFILE commands would accomplish this, but I would have a couple hundred more commands to write and run.
DBCC SHRINKFILE (N'MyBigDB' , 49800)
DBCC SHRINKFILE (N'MyBigDB' , 49600)
DBCC SHRINKFILE (N'MyBigDB' , 49400)
DBCC SHRINKFILE (N'MyBigDB' , 49200)
...
DBCC SHRINKFILE (N'MyBigDB' , 160)
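Rather than typing a couple hundred of these by hand, a small loop can generate the list. This sketch simply prints the commands so they can be reviewed before being run (the sizes match the example above):

```powershell
# Generate the incremental DBCC SHRINKFILE commands from the example above
for ($size = 49800; $size -gt 160; $size -= 200) {
    "DBCC SHRINKFILE (N'MyBigDB' , $size)"
}
"DBCC SHRINKFILE (N'MyBigDB' , 160)"  # final shrink to the target size
```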
Let's create a quick script to accomplish this task without all the repetition.
param (
    [Parameter(Mandatory=$True)][string]$SQLInstanceName,
    [Parameter(Mandatory=$True)][string]$DatabaseName,
    [Parameter(Mandatory=$True)][string]$FileGroupName,
    [Parameter(Mandatory=$True)][string]$FileName,
    [Parameter(Mandatory=$True)][int]$ShrinkSizeMB,
    [Parameter(Mandatory=$False)][int]$ShrinkIncrementMB = 10
)

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")

$Server = New-Object Microsoft.SqlServer.Management.Smo.Server $SQLInstanceName
$Database = $Server.Databases[$DatabaseName]
$Filegroup = $Database.FileGroups[$FileGroupName]
$File = $Filegroup.Files[$FileName]

$SizeMB = $File.Size/1024
"Starting Size = $SizeMB"

# Only shrink if the used space will actually fit in the requested target size
# (AvailableSpace is reported in KB, so convert it to MB first)
if ($SizeMB - $File.AvailableSpace/1024 -le $ShrinkSizeMB)
{
    while ($SizeMB -gt $ShrinkSizeMB)
    {
        # Shrink one increment at a time, but never below the requested target
        $TargetMB = [Math]::Max($SizeMB - $ShrinkIncrementMB, $ShrinkSizeMB)
        $File.Shrink($TargetMB, [Microsoft.SqlServer.Management.Smo.ShrinkMethod]::Default)

        # Refresh the SMO objects so the size properties reflect the shrink
        $Server.Refresh()
        $Database.Refresh()
        $File.Refresh()

        $NewSizeMB = $File.Size/1024
        "Size after shrink = $NewSizeMB"

        # Stop if no progress was made, to avoid looping forever
        if ($NewSizeMB -eq $SizeMB) { break }
        $SizeMB = $NewSizeMB
    }
}
Notice that I created some parameters so the script is re-usable. Parameters that are marked as mandatory will prompt the user for a value when the script is run if one is not provided. Also notice that after each shrink operation I refresh the server, database, and data file objects, which updates the file's size properties so the loop can check the new size and perform the next shrink operation.
Here is a sample command to run the above script when it is saved as ShrinkFile.ps1.
.\ShrinkFile.ps1 -SQLInstanceName 'v2012' `
                 -DatabaseName 'MyBigDB' `
                 -FileGroupName 'Primary' `
                 -FileName 'MyBigDB' `
                 -ShrinkSizeMB 48000 `
                 -ShrinkIncrementMB 20
Next Steps
- Defragment datafile after shrinking
- Deal with Large Objects for shrinking file
- Consider switching the database to the SIMPLE recovery model before shrinking (see: SQL Server Recovery Models). Shrink operations are logged, so switching to SIMPLE can make them run MUCH faster.
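The recovery model switch in the last step can be scripted as well. Here is a hedged sketch (it assumes Invoke-Sqlcmd from the SqlServer module is available; note that switching out of FULL breaks the log backup chain, so take a full backup after switching back):

```powershell
# Switch to SIMPLE for the duration of the shrink, then restore FULL.
Invoke-Sqlcmd -ServerInstance 'V2012' -Query "ALTER DATABASE [MyBigDB] SET RECOVERY SIMPLE;"
# ... run ShrinkFile.ps1 here ...
Invoke-Sqlcmd -ServerInstance 'V2012' -Query "ALTER DATABASE [MyBigDB] SET RECOVERY FULL;"
```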