Zip and Copy SQL Server Backup Files with PowerShell

By:   |   Comments (5)   |   Related: > Backup


We store our daily backups in a backup folder and keep these for four days before removing them. After a few recent disasters in our area, we've attached two shares that we are going to use to store backups for 60 days and we'd like to store these as zip files before we move them to these shares. We'd like to automate this task daily after our backup job where we get all of the recent backups, zip them, and copy the zip files to the two shares. How can we do this with PowerShell?


In this tip, we’ll look at automating a zipping protocol with PowerShell where we zip a database backup, copy the zipped backup to two folder locations (or shares), and remove the original. We’ll add a remove function in case we want to eventually remove zipped backups on the two folder locations.

Validate SQL Server Database and Backup Database

Our first step will ensure that we have a valid backup file, so we’ll run an integrity check before backing up the database. Once the integrity check passes, we’ll back up the database and verify that the backup file is valid. In some environments, limitations prevent us from running an integrity check before backing up the database. In these environments, one alternative practice is to back up the database, restore the backup file on a different server, run an integrity check there and alert if there’s a failure, then drop the restored database. Integrity checks should run as often as our tolerance for data loss demands – the more important our data are, the more often we need to validate their integrity. I’ve seen situations where integrity checks were skipped and the company later discovered months of corrupt database backups.

---- Verify integrity (database name matches the backup file prefix below)
DBCC CHECKDB('Data2017') WITH NO_INFOMSGS

---- Set date dynamic string
DECLARE @savestring VARCHAR(250) = 'I:\Backups\Data2017_' + CONVERT(VARCHAR(12),GETDATE(),112) + '.BAK'

---- Backup
BACKUP DATABASE [Data2017]
TO DISK = @savestring

---- Verify backup file
RESTORE VERIFYONLY
FROM DISK = @savestring

The backup command here only serves demonstration purposes, as we may want to include options with our backup, such as encryption, compression, checksums, etc. We also want a regular schedule where we restore backups to validate them, retaining the times and dates of when we checked their validity. At a minimum, as a best practice, every database backup should be verified right after the file has been written, which we see in the final step of the above T-SQL.

Zip SQL Server Database Backup Files

In the above T-SQL, we save the date information in the format of a four-digit year, then a two-digit month, then a two-digit day (style 112 in CONVERT). First, we want to mirror this format in the zip file name. PowerShell produces this format with the $date.ToString() method, passing "yyyyMMdd" in the parentheses. We’ll use this to keep the zip files in a naming convention similar to the backup file or files. We can see an example of this by calling the below line in a PowerShell ISE window or at the PowerShell command line:

(Get-Date).ToString("yyyyMMdd")

The first script’s design assumes that we have backups from multiple days in the same folder or location. Since the path holds multiple backup files and we only want the recent ones copied to the file shares, we will create a holding folder (a file store), copy the recent backup files into it, and zip that folder. The second script skips this step and assumes that all our backups are stored in a folder per date, eliminating the need to copy recent backup files to a holding folder before compressing.

The zip library in .NET that we’ll use is the System.IO.Compression.ZipFile class (loaded from the System.IO.Compression.FileSystem assembly), whose CreateFromDirectory method requires two paths – the folder to be zipped and the full path of the zip file to create. Because we copy all recent backups to the file store, we do not want to keep them after they’ve been zipped and copied to the shares, so this folder and all its contents are removed after the zip and copy.
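As a minimal, standalone sketch of CreateFromDirectory, we can zip a throwaway folder; the paths under $env:TEMP are illustrative stand-ins for the backup folders in the scripts below, not part of the original tip:

```powershell
# Minimal sketch of ZipFile.CreateFromDirectory using illustrative temp paths
Add-Type -AssemblyName System.IO.Compression.FileSystem

$src = Join-Path $env:TEMP "ziptip_src"
$zip = Join-Path $env:TEMP "ziptip.zip"

# Start from a clean slate so the zip call does not fail on an existing file
if (Test-Path $src) { Remove-Item $src -Recurse -Force }
if (Test-Path $zip) { Remove-Item $zip -Force }

New-Item -ItemType Directory -Path $src | Out-Null
Set-Content -Path (Join-Path $src "demo.bak") -Value "placeholder"

# First argument: the folder to compress; second: the zip file to create
[System.IO.Compression.ZipFile]::CreateFromDirectory($src, $zip)

Test-Path $zip   # True once the archive exists
```

Note that CreateFromDirectory throws if the destination zip already exists, which is one reason the scripts below build a fresh, date-stamped zip name each day.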

### Script 1 
Function ZipCopy-BAKs {
    Param(
        [Parameter(Mandatory=$true)][string]$filepath
        , [Parameter(Mandatory=$true)][string]$store
        , [Parameter(Mandatory=$true)][string]$share1
        , [Parameter(Mandatory=$false)][string]$share2
    )
    Process
    {
        # Name the zip file after today's date, matching the backup naming convention
        $zip = (Get-Date).ToString("yyyyMMdd") + ".zip"

        # Recreate the holding folder so it only contains this run's backups
        if (Test-Path $store) { Remove-Item $store -Recurse -Force }
        New-Item -ItemType Directory -Path $store | Out-Null

        # Get backup files created within the last 12 hours
        $allbakfiles = Get-ChildItem $filepath -Filter *.Bak | Where-Object { $_.CreationTime -gt (Get-Date).AddHours(-12) }

        foreach ($bakfile in $allbakfiles)
        {
            Copy-Item -Path $bakfile.FullName -Destination $store
        }

        # Zip the holding folder to each share, then remove the holding folder
        Add-Type -AssemblyName System.IO.Compression.FileSystem
        [System.IO.Compression.ZipFile]::CreateFromDirectory($store,"$share1$zip")
        if ($share2) { [System.IO.Compression.ZipFile]::CreateFromDirectory($store,"$share2$zip") }

        Remove-Item $store -Recurse -Force
    }
}

ZipCopy-BAKs -filepath "I:\backups\" -store "I:\backups\store\" -share1 "H:\backups\bakshare\" -share2 "G:\backups\bakshare\"
### Script 2
Function ZipCopy-BAKs {
    Param(
        [Parameter(Mandatory=$true)][string]$filepath
        , [Parameter(Mandatory=$true)][string]$share1
        , [Parameter(Mandatory=$false)][string]$share2
    )
    Process
    {
        # Name the zip file after today's date
        $zip = (Get-Date).ToString("yyyyMMdd") + ".zip"

        # Zip the dated backup folder directly to each share
        Add-Type -AssemblyName System.IO.Compression.FileSystem
        [System.IO.Compression.ZipFile]::CreateFromDirectory($filepath,"$share1$zip")
        if ($share2) { [System.IO.Compression.ZipFile]::CreateFromDirectory($filepath,"$share2$zip") }
    }
}

ZipCopy-BAKs -filepath "I:\backups\" -share1 "H:\backups\bakshare\" -share2 "G:\backups\bakshare\"

In the first script, we filter on the creation date of the files to ensure that we only get the recent backup files. If this zip-and-copy step follows a backup job (as a SQL Server Agent job, for example - see the image below), this ensures that we only get the backups from today. If we need to extend or shorten the window, we can add or subtract hours to suit our environment:

SQL Server Agent job example
$allbakfiles = Get-ChildItem $filepath -Filter *.Bak | Where-Object { $_.CreationTime -gt (Get-Date).AddHours(-12) }

For example, if we wanted to get all files from the last six hours, we would just change this line to:

$allbakfiles = Get-ChildItem $filepath -Filter *.Bak | Where-Object { $_.CreationTime -gt (Get-Date).AddHours(-6) }

As long as this step follows the backup step, however we choose to run it (Task Scheduler, SQL Server Agent, etc.), the backups will complete before this step runs. If we do not run it directly after our backup step, such as running our backups in a SQL Server Agent job but running our zip and copy in Task Scheduler, we will want to make sure the time window covers a backup that runs longer than expected, such as a window of up to 23 hours.
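As a quick sketch, the same window logic the scripts apply to each file's CreationTime can be demonstrated on plain datetime values, so it runs without any backup files present:

```powershell
# The same Where-Object window the scripts apply to $_.CreationTime,
# demonstrated on plain datetimes (no backup files required)
$cutoff = (Get-Date).AddHours(-12)

$backuptimes = @(
    (Get-Date).AddHours(-2),   # a backup from earlier today: inside the window
    (Get-Date).AddHours(-30)   # a backup from yesterday: outside the window
)

$recent = $backuptimes | Where-Object { $_ -gt $cutoff }
@($recent).Count   # 1 - only the two-hour-old timestamp passes
```

Widening the window only means moving the cutoff further back (for example, AddHours(-23)); the comparison itself never changes.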

Next Steps
  • Organize your backup locations by folder: everything in one folder, a folder per database name, a folder per time period, or a combination of these. Either of the above scripts will function in most environments, provided the database backups are within a folder.
  • While this tip covers ways in which we can have multiple copies of compressed backups, we want to still follow strong backup best practices in T-SQL by always checking integrity on our database and validating our backup file after the backup. Our environment may require other options specified during the backup process such as encrypting backups, compressing them with T-SQL, adding checksums, etc.
  • PowerShell scripts can be executed within SQL Server Agents or as Windows Scheduled Tasks.
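The folder-per-date layout from the first bullet can be sketched as below; the root path under $env:TEMP is a hypothetical stand-in for a real backup drive:

```powershell
# Create (if needed) a yyyyMMdd-named folder, mirroring the zip naming
# convention used in this tip; $root is a hypothetical backup root
$root  = Join-Path $env:TEMP "bakroot"
$dated = Join-Path $root ((Get-Date).ToString("yyyyMMdd"))

if (-not (Test-Path $dated)) { New-Item -ItemType Directory -Path $dated -Force | Out-Null }

# Script 2 can then point -filepath at today's folder:
#   ZipCopy-BAKs -filepath $dated -share1 "H:\backups\bakshare\"
Test-Path $dated   # True
```

With a dated folder per day, the creation-time filter from Script 1 becomes unnecessary, since the folder itself scopes the backups to a single day.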


About the author
Tim Smith works as a DBA and developer and also teaches Automating ETL on Udemy.

This author pledges the content of this article is based on professional experience and not AI generated.


Comments For This Article

Monday, May 21, 2018 - 10:42:52 AM - Tim Back To Top (75992)

While this tip covers ways in which we can have multiple copies of compressed backups, we want to still follow strong backup best practices in T-SQL by always checking integrity on our database and validating our backup file after the backup. Our environment may require other options specified during the backup process such as encrypting backups, compressing them with T-SQL, adding checksums, etc.

You should always follow the best practices in relation to your environment, which takes into account the version of the tool(s) you use.  This inherently means some things will and will not apply to you.

Monday, May 21, 2018 - 8:17:05 AM - Del Back To Top (75990)

Since backup compression has been built into SQL Server since the 2008 version, it is difficult to understand why someone would still be using some other compression.  In fact, (long ago) we used to use a solution like this when we were running SQL 2005, but we changed when we moved to 2008.  Aside from simplifying the compression step, the backups (and restores - for those who regularly do that to a test environment) go much faster.

Monday, May 21, 2018 - 7:04:36 AM - Dave Boltman Back To Top (75989)

SQL Server already has the option to compress backups (in some editions), using WITH COMPRESSION in the BACKUP statement.

However, in my experience the compression provided by SQL Server is not very good at all. 7-zip provides much much better compression, and I'd recommend zipping database backups with that.

Monday, May 21, 2018 - 6:29:46 AM - Ruwan P Back To Top (75988)

Good article. Thanks, some very good tips.


Monday, May 21, 2018 - 6:26:28 AM - Rajasekhar Reddy Back To Top (75987)

I don't think you will save much space by compressing when you already take the backup with compression.
