Steps to rollback database changes without impacting SQL Server Log Shipping



I have a production release for a database that has log shipping set up.  The database is very large and I don't want to redo the entire log shipping setup if I have a problem with the release and need to roll back the changes.  I plan to take a full backup of the database prior to the changes, but how can I easily roll back without impacting log shipping if I need to go that route?


It is quite common to create a full backup of a database before a major database release. As part of the release, we may be given scripts to run on the database or new functionality to configure. However, if we run into issues during the release and need to restore the database, and that database is a log shipping primary, we have some tough tasks at hand. It is important to roll back successfully and bring the system back to the state it was in prior to the change. There are some simple steps we can follow to avoid reconfiguring log shipping all over again, saving time and ensuring dependent systems are not affected. Before we get started, this tip assumes log shipping is already configured on your server.

Steps to follow

1. Connect to the primary server and run the Log Shipping Backup job for the primary database to create the log backup.

2. Connect to the secondary server and run the Log Shipping Copy and Restore jobs, so the primary and secondary databases are in sync.
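Steps 1 and 2 can also be kicked off from T-SQL with sp_start_job. The sketch below assumes the default log shipping job name patterns (LSBackup_<dbname> on the primary, LSCopy_<server>_<dbname> and LSRestore_<server>_<dbname> on the secondary); substitute the job names exactly as they appear under SQL Server Agent on your servers. Note that sp_start_job returns as soon as the job starts, so wait for each job to finish before starting the next.

--Run this on the primary server (job name is an example)
exec msdb.dbo.sp_start_job @job_name = N'LSBackup_GiveDBName'

--Run these on the secondary server, one at a time, after the backup job completes
exec msdb.dbo.sp_start_job @job_name = N'LSCopy_PrimaryServer_GiveDBName'
exec msdb.dbo.sp_start_job @job_name = N'LSRestore_PrimaryServer_GiveDBName'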

3. At this point, both the primary and secondary databases are in sync. The transaction log shipping status can be checked by right clicking on the instance name and selecting Reports > Standard Reports > Transaction Log Shipping Status.

4. Take a full backup of the primary log shipped database.
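As a sketch, the full backup in step 4 can be taken with a script like the one below; the database name and file path are placeholders. A regular full backup does not break the log backup chain, so taking it does not disturb log shipping.

--Run this on the primary server (database name and file path are examples)
backup database GiveDBName
to disk = 'D:\SQLBackups\GiveDBName_PreRelease.bak'
with stats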

5. Once the full backup has completed successfully, repeat steps 1, 2, and 3 and then disable the Log Shipping jobs on both servers, so they don't run again. That is, disable the LSBackup job on the primary, LSCopy and LSRestore jobs on the secondary. Take note of the last transaction file that was restored on the log shipping secondary database. You can use the script below to get this information.

--Run this on secondary server
select secondary_server,secondary_database,last_restored_file,last_restored_date
from msdb.dbo.log_shipping_monitor_secondary
where secondary_database='Enter your Secondary DB Name'
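Disabling the jobs in step 5 can be scripted with sp_update_job, as sketched below; the job names are examples that assume the default log shipping naming, so substitute your actual job names.

--Run this on the primary server
exec msdb.dbo.sp_update_job @job_name = N'LSBackup_GiveDBName', @enabled = 0

--Run these on the secondary server
exec msdb.dbo.sp_update_job @job_name = N'LSCopy_PrimaryServer_GiveDBName', @enabled = 0
exec msdb.dbo.sp_update_job @job_name = N'LSRestore_PrimaryServer_GiveDBName', @enabled = 0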

6. Proceed with the database change implementation. At this point, you may need to watch the log file utilization on the primary database. If you are concurrently running scripts, you can check the log space used with a command like the one below.

--Run this on the primary server
--Reports log size and percent of log space used for every database
DBCC SQLPERF(LOGSPACE)

If there is a big increase in log file utilization on the primary database and you want to keep it in check, you may want to enable the LSBackup job on the primary (step 1) and keep note of the log backups that get generated. This is especially useful if the release fails and you need to restore the primary database with the backup taken in step 4. You can use the script below to note the last log backup file that was generated.

--Run this on primary server
select primary_server,primary_database,last_backup_file,last_backup_date
from msdb.dbo.log_shipping_monitor_primary
where primary_database='Enter your primary DB Name'

Scenario 1: database changes are successful and no roll back needed

In this case, you just need to resume the log shipping jobs on both servers and everything should be fine. That is, enable the LSBackup job on the primary and the LSCopy and LSRestore jobs on the secondary, then run the LSBackup job followed by the LSCopy job and finally the LSRestore job. Log shipping should be in sync again, which can be verified as in step 3.
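As a sketch, resuming the jobs can be scripted as below; the job names are examples that assume the default log shipping naming. Re-enable each job with sp_update_job, then start it with sp_start_job, waiting for each job to finish before moving on.

--Run this on the primary server
exec msdb.dbo.sp_update_job @job_name = N'LSBackup_GiveDBName', @enabled = 1
exec msdb.dbo.sp_start_job @job_name = N'LSBackup_GiveDBName'

--Run these on the secondary server after the backup job completes, one at a time
exec msdb.dbo.sp_update_job @job_name = N'LSCopy_PrimaryServer_GiveDBName', @enabled = 1
exec msdb.dbo.sp_start_job @job_name = N'LSCopy_PrimaryServer_GiveDBName'
exec msdb.dbo.sp_update_job @job_name = N'LSRestore_PrimaryServer_GiveDBName', @enabled = 1
exec msdb.dbo.sp_start_job @job_name = N'LSRestore_PrimaryServer_GiveDBName'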

Scenario 2: database changes are unsuccessful and need to rollback

You would need to perform the below steps to restore the database back to the state before the changes.

CAUTION: If you enabled the LSBackup job (step 6) on the primary due to log file utilization and the job created log backups, it is important to delete all of the log backups generated during that time from the log shipping backup folder. Otherwise, these log backups would be copied over to the secondary server when the LSCopy job runs and would cause the LSRestore job to fail. You can get the actual log backup location using the script in step 6; the last_backup_file column contains the folder location on the primary server.

1. Restore the primary database using the full backup from step 4, leaving the database in the restoring state so the log backup can be applied next. Use the sample script below.

--Run this on primary server
--This should be the full backup taken in step 4
restore database GiveDBName
from disk = 'Give Backup file location'
with replace,stats,norecovery

2. After restoring the primary database with the full backup, restore the transaction log backup taken in step 5 and bring the database online. Use the sample script below.

--Run this on primary server
restore log GiveDBName
from disk = 'T-Log Backup file Location'
with stats,recovery

After these two steps, the primary database should be back in sync with the secondary database, and you can resume the log shipping jobs on both servers.  Enable the LSBackup job on the primary and the LSCopy and LSRestore jobs on the secondary, and run them in that order. The jobs should succeed, and the status of log shipping can be checked as in step 3.


About the author
Mohammed Moinudheen is a SQL Server DBA with over 6 years of experience managing production databases for a few Fortune 500 companies.

This author pledges the content of this article is based on professional experience and not AI generated.


Comments For This Article

Thursday, November 21, 2013 - 1:50:12 AM - Chhavi

Thanks a bunch Mohammed for this post. Good one.

Monday, October 14, 2013 - 12:54:51 PM - Srinath

That's an excellent post, Moinu, as always !! Keep writing. Thank you.
