Backing up your SQL Server to Amazon's Cloud in Three Easy Steps

By: John Grover   |   Comments (25)   |   Related: > Backup


Problem

My organization is beginning to invest in Infrastructure as a Service (IaaS), and one of the things we are interested in doing is moving our SQL Server backups to the cloud. How can I do this safely, securely, and affordably?

Solution

Amazon's Simple Storage Service (S3) is an inexpensive and effective way to move your backups off-site with minimal effort. In three easy steps you can have your SQL Server backed up to Amazon's redundant infrastructure for less than a nickel per gigabyte per month. This is a 300-level tip: I expect that the reader has had some experience with AWS, PowerShell, SQL Server Maintenance Plans and SQL Server Agent Jobs. If you're not that advanced, read on anyway and I'll point you to some great references for the details I don't cover explicitly.

Step 1: Set up Amazon S3 Storage

In his article "Introduction to the Amazon Relational Database Service", Tim Cullen provided detailed instructions on how to create an Amazon AWS account. If you don't already have an account, take a few minutes to set one up. Everything in this tip can be accomplished using the available free tier if you want to follow along. Even my source database is SQL Server 2012 running on a t2.micro EC2 instance.

Once you have set up your Amazon AWS account you will need to create a user and a key pair. These are used by the AWS PowerShell toolkit to access and manipulate objects in AWS. Starting at the AWS console, click on the Identity and Access Management (IAM) icon to get to the IAM dashboard.


On the menu to the left choose "Users" and on the next page click the blue "create new users" button. Enter a username in the first box, make sure the box labeled "Generate an access key for each user" is checked, then click "create" on the bottom left.


You'll be presented with a warning that this is the only time the user's security credentials will be available for download.

Click the "download credentials" button

Click the "download credentials" button at the bottom of the page and save the credentials.csv file somewhere safe, you'll need it later when we set up PowerShell for AWS. Once you've downloaded the credentials file you can click on "close" to return to the IAM dashboard and you will see your new user listed. Click on the orange box in the upper right to return to the AWS control panel.

Next we will create an S3 "bucket" into which to upload our backup files. Click on the S3 icon on the AWS control panel.



At the S3 console click the "Create Bucket" button on the upper left to get the create bucket dialog.


Click the "Create Bucket" button

Give your bucket a unique name. The namespace for S3 buckets is global, and bucket names must follow DNS naming rules and be all lower case (for more information on bucket naming click the link on the dialog box or go to https://docs.aws.amazon.com/AmazonS3/latest/UG/CreatingaBucket.html). Don't worry about setting up logging; we won't use it in this tip. Just click "create" and you will be returned to the S3 console.
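As an aside, once the AWS PowerShell toolkit from Step 2 is installed the bucket can also be created from a script. A minimal sketch, using the example bucket name from later in this tip:

# A sketch - assumes the AWS PowerShell toolkit and credentials are already set up.
# The bucket name below is the example used throughout this tip; yours must be globally unique.
New-S3Bucket -BucketName mssqltips-demo-bucket-01 -Region us-east-1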

Click on the "Properties" button in the upper right of the page. For this tip we are going to set the retention policy of the bucket to 30 days, then delete. You can set the retention to as long as you like depending on how much you want to pay for the storage. You can also choose to have the aged files transferred to Amazon Glacier which is less expensive storage, but has the disadvantage of a very slow (glacial) restore time and more expensive data retrieval rate. Click on the "Lifecycle" band to expand it and then click the "add rule" button. This brings up the lifecycle rules dialog. For step one choose to apply the rule to the whole bucket then click the "Configure Rule" button to go on to the next step.

Click on the "Properties" button in the upper right of the page.

This step is where we configure the retention period and the disposition of the files when they expire. We will choose "Permanently Delete Only" from the "Action on Objects" drop-down. There are also options to do nothing, archive to Glacier only, or archive to Glacier and then delete. Once you've chosen an option you can specify the number of days after the object's creation that the action will take place. You can click the "See an example" link for a graphical explanation of what your setting will do.

Choose "Permanently Delete Only" from the "Action on Objects" drop-down.

Click the "Review" button to finish configuring and naming the lifecycle rule. Give it a descriptive name, then click "Create and Activate Rule". You have the chance to make any adjustments here and you can always go back and change the lifecycle rule later.

Click the "Review" button to finish configuring and naming the lifecycle rule

Back at the S3 console you can now see your new rule in the Lifecycle section.
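If you would rather script the retention rule than click through the console, the same 30-day expiration can be applied with the AWS PowerShell toolkit. The sketch below uses the Amazon.S3.Model classes the module exposes; the rule id and bucket name are only examples.

# A sketch - expire (permanently delete) everything in the bucket 30 days after creation.
Import-Module -Name AWSPowerShell

$rule = New-Object Amazon.S3.Model.LifecycleRule
$rule.Id = "expire-backups-after-30-days"
$rule.Prefix = ""                                      # empty prefix = apply to the whole bucket
$rule.Status = [Amazon.S3.LifecycleRuleStatus]::Enabled
$rule.Expiration = New-Object Amazon.S3.Model.LifecycleRuleExpiration
$rule.Expiration.Days = 30

Write-S3LifecycleConfiguration -BucketName mssqltips-demo-bucket-01 -Configuration_Rule $rule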

Now we have created an S3 bucket for our backup files and applied a retention policy. We are ready to download and configure the tools we need to move our backups to the cloud.

Step 2: Install and configure the Amazon PowerShell module.

Amazon has a very complete API which can be used from various languages, as well as a command line interface. They have also created a PowerShell toolkit that will let you do just about everything you can do from the AWS console, and in some cases even more. The SDK and PowerShell toolkit are bundled together and downloadable from Amazon at http://aws.amazon.com/powershell. Click on the "AWS Tools for Windows" button to download the installer. Running the installer lets you choose which components you want to install: the full SDK, just PowerShell, sample code, and a toolkit for Visual Studio. Since I'm using a throw-away EC2 instance I just installed the default, which is everything.
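A quick way to confirm the toolkit installed correctly is to open a new PowerShell window and check the module version:

# Load the module and report the installed AWS Tools version
Import-Module -Name AWSPowerShell
Get-AWSPowerShellVersion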


Whether you're using PowerShell, the various APIs or the command line interface, every call to AWS must be authenticated using the credentials we created and downloaded in step one. Amazon's solution for PowerShell is to maintain your credentials in a secure, local store and load them into memory as needed so they do not have to be hard-coded into your scripts. The file is located in %LOCALAPPDATA%\AWSToolKit\RegisteredAccounts.json and the keys are obfuscated rather than stored in plain text. For a complete description of Amazon's best practices for key management you can refer to their documentation at http://docs.aws.amazon.com/general/latest/gr/aws-access-keys-best-practices.html.

We're going to run the PowerShell part of our backup Maintenance Plan using a separate account that only has the rights it needs: to run the backups, copy the files to S3, and delete the old files. On the server, create a new account called "BackupUser" with no special privileges, then give it full rights only on the backup directory.
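On a standalone server the account and the directory permissions can be set up from an elevated prompt along these lines. This is only a sketch: the password and the S:\mssql\bkup path (the backup root used later in this tip) are examples, and in a domain you would create the account in Active Directory instead.

# A sketch - create the low-privilege account and give it full rights on the backup root only.
# Run from an elevated PowerShell prompt; change the password and path to suit your environment.
net user BackupUser "ChangeTh1sPassword!" /add
icacls "S:\mssql\bkup" /grant "BackupUser:(OI)(CI)F"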


In SQL Server, create a Windows login for this account and give it the sysadmin role so it can create and run the Maintenance Plan.
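If you prefer T-SQL to the SSMS GUI, the login can be created from PowerShell with Invoke-Sqlcmd (or by running the same statements in a query window). A sketch, assuming the sqlps module that ships with SQL Server 2012 and a domain or machine account named DOMAINNAME\BackupUser:

# A sketch - create the Windows login for BackupUser and add it to the sysadmin role.
Import-Module SQLPS -DisableNameChecking
Invoke-Sqlcmd -ServerInstance "." -Query @"
CREATE LOGIN [DOMAINNAME\BackupUser] FROM WINDOWS;
ALTER SERVER ROLE [sysadmin] ADD MEMBER [DOMAINNAME\BackupUser];
"@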

Now we will load the credentials we created into a local secure store. First we need to log on to the server with the BackupUser account, since it will be running the PowerShell. Once we've logged in, open the credentials.csv file we downloaded previously in a text editor or spreadsheet. To make your credentials available to your scripts, use the Set-AWSCredentials cmdlet as shown below.


Set-AWSCredentials -AccessKey AKIAJ2G7LB7Z6EXAMPLE -SecretKey tOcCPhuPufirLNpydXeU4MWeC7V4YhExampleKey -StoreAs default

Complete information on adding and removing credentials and credential profiles is in the Amazon documentation at http://docs.aws.amazon.com/powershell/latest/userguide/specifying-your-aws-credentials.html. You can verify that the credentials are available for use by running the Get-AWSCredentials cmdlet with the -ListStoredCredentials argument; it returns the names of all the profiles registered in the persistent store.

PS C:\> Get-AWSCredentials -ListStoredCredentials
default

The Set-AWSCredentials cmdlet writes your keys to the local store for the OS account under which it is running, so you will need to run this command for every account that will use the AWS PowerShell toolkit.

Step 3: Set up your Maintenance plan using SQL Native backups and a simple PowerShell script

For this tip we will build a Maintenance Plan to back up the databases and transaction logs, move the backups to S3 using PowerShell, and clean up the local backup copies.

Since the SQL Server Agent account does not have sufficient privileges on the server to do that, we will set up a proxy to run the SQL Server Agent Jobs that are a part of the plan. First, create a credential for the BackupUser account:

USE [master]
GO
CREATE CREDENTIAL [SQLBackupUser] WITH IDENTITY = N'DOMAINNAME\BackupUser', SECRET = N'verysecurepasswordhere'
GO

Then create a proxy for the account, configured to run CmdExec (subsystem_id 3).

USE [msdb]
GO
EXEC msdb.dbo.sp_add_proxy @proxy_name=N'SQLBackupUser', @credential_name=N'SQLBackupUser', @enabled=1
GO
EXEC msdb.dbo.sp_grant_proxy_to_subsystem @proxy_name=N'SQLBackupUser', @subsystem_id=3
GO

We cannot execute our PowerShell directly from a Maintenance Plan; we need to wrap it in a SQL Server Agent job first. Further, because of the way the Agent sets up the environment for the job, we need to wrap the PowerShell in a batch program where we explicitly set %USERPROFILE%. Otherwise it will be set to c:\Users\Default instead of c:\Users\BackupUser and PowerShell will not be able to find the AWS credentials we stored in the previous step. The batch file also accepts a single command line parameter that is passed directly to the PowerShell script to tell it whether it is running for full or transaction log backups.

@echo off
echo ================= BEGIN ================
echo USERNAME ....... %USERNAME%
echo USERPROFILE .... %USERPROFILE%
set USERPROFILE=C:\Users\%USERNAME%
echo NEW USERPROFILE. %USERPROFILE%
echo ================= PS ===================
PowerShell c:\AWS\BackupToS3.ps1 %1
echo ================= END ==================

Now let's set up the job, then we'll delve into the PowerShell that will do the heavy lifting to get the backups moved to S3.




The job has a single step that runs the batch file above, passing either "*.bak" to copy a full backup to S3 or "*.trn" to copy a transaction log backup. Note that the job step has to run as the proxy we created earlier so that it has rights to the files in the backup directories.


That's all for the full backup job. Now create a second job just like it for the transaction logs, except pass "*.trn" as the parameter to the command instead of "*.bak".
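The jobs can also be created with T-SQL against msdb instead of clicking through SSMS. The sketch below (run here through Invoke-Sqlcmd) creates the full backup job; the job name, step name, and batch file path are examples, so adjust them to match what you actually created.

# A sketch - create the "copy full backups to S3" Agent job and its CmdExec step,
# running under the SQLBackupUser proxy from earlier. Names and paths are examples.
Import-Module SQLPS -DisableNameChecking
Invoke-Sqlcmd -ServerInstance "." -Database msdb -Query @"
EXEC dbo.sp_add_job @job_name = N'S3 Archive Full Backup';
EXEC dbo.sp_add_jobstep
     @job_name   = N'S3 Archive Full Backup',
     @step_name  = N'Copy full backups to S3',
     @subsystem  = N'CmdExec',
     @command    = N'c:\AWS\BackupToS3.cmd "*.bak"',
     @proxy_name = N'SQLBackupUser';
EXEC dbo.sp_add_jobserver @job_name = N'S3 Archive Full Backup';
"@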

The Maintenance Plan will be very bare-bones: it will have one sub-plan each for full backups and transaction log backups, which will be essentially the same except for the schedule and the files they back up. Create a new Maintenance Plan called "Database Backup" and start the first sub-plan.

First create the Back Up Database task. In this tip we will back up all the databases and compress the backups; since we are going to pay for storage by the gigabyte, we want to keep our bill as low as possible. We will put all the backups in the same directory with subdirectories for each database, which is where the PowerShell will look for them when it's time to copy to S3.
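Under the covers the Back Up Database task issues native BACKUP DATABASE commands. If you want to see or test the equivalent outside the plan, something like this sketch works; the database name and file path are examples, and compression requires an edition that supports it.

# A sketch of the equivalent compressed native backup for a single database.
Import-Module SQLPS -DisableNameChecking
Backup-SqlDatabase -ServerInstance "." -Database "MyDatabase" `
    -BackupFile "S:\mssql\bkup\MyDatabase\MyDatabase_full.bak" -CompressionOption On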


In SQL Server Management Studio, if we want to run an external command from a Maintenance Plan we need to use the Execute SQL Server Agent Job task and tell it to run the first job we just defined (the "*.bak" version).

In SQL Server Management Studio if we want to run an external command from a maintenance plan we need to use the "execute SQL Server agent job" task

Finally, we will clean up after ourselves. We will leave the three most recent days of backups on the local disk in case we need them; the rest will already be in S3 for us to copy back when needed.
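The Maintenance Cleanup Task handles this for us, but if you ever need to do the same thing by hand, the PowerShell equivalent is roughly the sketch below (it assumes the backup root used in this tip and a three-day cutoff):

# A sketch - remove local .bak and .trn files that are more than three days old.
Get-ChildItem -Path S:\mssql\bkup -Recurse -Include *.bak,*.trn |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-3) } |
    Remove-Item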


Now, link the steps together, add a schedule and the first sub-plan is complete. In this example the full backup is scheduled daily.


Now create the second sub-plan for the transaction log backups, making sure the Execute SQL Server Agent Job task runs the transaction log job that passes "*.trn" as the argument to the batch file. Also set the schedule to something more frequent; for this tip we will use every four hours.

We're ready to write the PowerShell to move the backups to S3. The first step is to make sure the AWS PowerShell toolkit is loaded. Depending on your version of PowerShell this may happen automatically, but we'll include the code just in case.

#
# SQL Server Backup to S3
#

# Get the AWS Stuff
Import-Module -Name AWSPowerShell

Next, we'll check that the parameter passed on the command line is okay. We only know about two kinds of backups; anything else will cause the script to terminate.

# Should be "*.bak" or "*.trn"
if ( $args[0] -eq "*.bak" -or $args[0] -eq "*.trn" ) { $backuptype = $args[0] } else { exit 1 }

Here we load the credentials from the persistent store we created. This will only work if the script is running as the same user who created the store and the %USERPROFILE% environment variable is set correctly. The "ProfileName" must also match what was saved; if you look back you will see we specified "default" when we saved the credentials.

# Get credentials from the persisted store
Initialize-AWSDefaults -ProfileName default -Region us-east-1

# Go to base backup location
Set-Location s:\mssql\bkup

# Loop thru the subdirectories for each database
Get-ChildItem | Foreach-Object {
    if ($_.PSIsContainer) {

        # Set the prefix for the S3 key
        $keyPrefix = "sql-server-backups/" + $_.name

        # Switch to database subdirectory
        Set-Location $_.name;

At this point we've moved into the backup directory and are looping through each database subdirectory. We set the $keyPrefix variable to the name of the folder we will create in S3 plus the name of the database's subfolder. Next we get the name of the newest backup file of the type specified for this run. This script runs immediately after the backup task in the Maintenance Plan, so we want to grab the file that was just created.

        # Get the newest file in the list
        $backupName = Get-ChildItem $backuptype | Sort-Object -Property LastAccessTime | Select-Object -Last 1

S3 is not a file system; it stores objects as key/value pairs. For all intents and purposes it looks like a file system, but it is important to know the difference. The file is saved as an object with a key, then retrieved using the same key (which looks very much like a directory path and filename, but isn't really).

        # build the S3 Key
        $s3Keyname = $keyPrefix + "/" + $backupName.Name

Finally we write the file out to S3. We specify the bucket name to be used, the key (or filename) to be used in S3, the local file to be uploaded, and the encryption to be used. In this tip we're using server side encryption, so the file will be encrypted at rest by Amazon using their keys. There is a client side encryption option as well, which would require us to set up our own key management system but would give us more control over who has access to the files. With server side encryption the files are encrypted, but anyone with credentials to access the S3 bucket can still retrieve an unencrypted copy.

        # Copy the file out to Amazon S3 storage
        Write-S3Object -BucketName mssqltips-demo-bucket-01 -Key $s3keyname -File $backupName -ServerSideEncryption AES256

        # Go back to the base backup location
        Set-Location s:\mssql\bkup
    }
}

To retrieve your files from S3, use the following command:

Read-S3Object -BucketName "BUCKETNAME" -Key "S3KEYNAME" -File "LOCALFILENAME"
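For example, to find and download the most recent full backup for a single database you could combine Get-S3Object with Read-S3Object. This is only a sketch: the bucket name and key prefix match the examples used in this tip, while "MyDatabase" and the local restore path are placeholders.

# A sketch - download the newest full backup for one database from S3.
Import-Module -Name AWSPowerShell
Initialize-AWSDefaults -ProfileName default -Region us-east-1

# Find the newest .bak object under this database's key prefix
$newest = Get-S3Object -BucketName mssqltips-demo-bucket-01 -KeyPrefix "sql-server-backups/MyDatabase" |
    Where-Object { $_.Key -like "*.bak" } |
    Sort-Object -Property LastModified |
    Select-Object -Last 1

# Download it to a local restore folder, keeping the original file name
Read-S3Object -BucketName mssqltips-demo-bucket-01 -Key $newest.Key `
    -File ("C:\restore\" + (Split-Path -Path $newest.Key -Leaf))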

And there you have it, a complete system for backing up your SQL Server databases to the cloud in three easy steps!

Summary

In Step One we started by making an Amazon Web Services account if we did not already have one. We added a user to the account and generated credentials to be used to access our S3 bucket. We created and configured our S3 bucket with a 30-day retention policy.

In Step Two we installed the PowerShell toolkit and created a local credentials store.

In Step Three we created a Maintenance Plan to back up the databases, run some custom PowerShell to copy the backups to S3, and keep our local backup directory cleaned up.

Next Steps





About the author
MSSQLTips author John Grover John Grover is a 25-year IT veteran with experience in SQL Server, Oracle, clustering and virtualization.

This author pledges the content of this article is based on professional experience and not AI generated.




Comments For This Article




Thursday, May 14, 2020 - 1:20:10 PM - Jason L. Back To Top (85661)

I found some code that allows the EC2 role to be assumed in a PowerShell script, and if I hard code the variable for the extension in a test script and run it, it works. But running it as the SQL job fails with a "could not find file specified" error. I believe there is a permission issue here but can't get any more error information to troubleshoot. Any ideas? Code below:

 

#
# SQL Server Backup to S3
#

# Get the AWS Stuff
Import-Module -Name AWSPowerShell

$YourRoleArnHere = "arn:aws:iam::<accountnumber>:role/<rolename>"

# This is the URI for the folder one level above the credentials we need.
$MetadataUri = `
    "http://169.254.169.254/latest/meta-data/iam/security-credentials"

# We need to get the contents of the folder to know the name of the subfolder.
# Technically there could be multiple but I haven't seen that happen. We will
# just use the first one.
$CredentialsList = ( `
    Invoke-WebRequest -uri $MetadataUri `
).Content.Split()

# Get the credentials and turn the JSON text into an object.
$CredentialsObject = (Invoke-WebRequest `
    -uri "$MetadataUri/$($CredentialsList[0])" `
).Content | ConvertFrom-Json

# Create/update a profile using the temporary access key and secret key
# we retrieved from the metadata.
Set-AWSCredential `
    -StoreAs InstanceProfile `
    -AccessKey $CredentialsObject.AccessKeyId `
    -SecretKey $CredentialsObject.SecretAccessKey `
    -SessionToken $CredentialsObject.Token

# Create/update the assume role profile.
Set-AWSCredential `
    -StoreAs default `
    -RoleArn $YourRoleArnHere `
    -SourceProfile InstanceProfile

# Should be "*.bak" or "*.trn"
if ( $args[0] -eq "*.bak" -or $args[0] -eq "*.trn" ) { $backuptype = $args[0] } else { exit 1 }

# Get credentials from the persisted store
Initialize-AWSDefaults -ProfileName InstanceProfile -Region us-east-1

# Go to base backup location
#Set-Location <\\SERVER\PATH>

# Loop thru the files
Get-ChildItem $backuptype | Sort-Object -Property LastAccessTime | Select-Object -Last 1 | Foreach-Object {

        # Set the prefix for the S3 key
        $keyPrefix = "sql-server-backups/" + $backuptype.Substring(2) + "/" + $_.name

        # Copy the file out to Amazon S3 storage
        Write-S3Object -BucketName <bucketname> -Key $keyPrefix -File $_.name -ServerSideEncryption AES256
}


Thursday, May 14, 2020 - 9:19:29 AM - Jason L. Back To Top (85653)

I have this working, but would prefer to use an EC2 role to authenticate to S3. I commented out the Initialize-AWSDefaults statement and get Access Denied (the EC2 role is specified in the bucket policy). Do you know how I can tell this PowerShell to use the EC2 role instead?


Saturday, August 11, 2018 - 11:50:15 AM - John Grover Back To Top (77138)

The $backuptype variable comes from $args[0] and indicates a full or transaction log backup; the example does not deal with differential backups. The expected content is "*.bak" or "*.trn" as indicated in the second PS snippet.

I hope that answers your question!


Friday, August 10, 2018 - 5:29:16 PM - SQLDB Back To Top (77126)

 Hi John,

Thanks for the article. I have a question about the step that gets the newest backup file.

        # Get the newest file in the list
        $backupName = Get-ChildItem $backuptype | Sort-Object -Property LastAccessTime | Select-Object -Last 1

What is the $backuptype variable here?
When I use this command I am only getting the latest file from the entire directory. I would like to get the latest files from all the subdirectories under the root.

Thanks,


Tuesday, January 23, 2018 - 11:18:56 AM - Adam Back To Top (75021)

 

I am the accidental DBA for our small business and found this article very helpful.  Other articles on offsite database backup were a little over my head.  Though I tweaked some things to fit my situation, this is just what I needed to get started.  Nicely written and thorough.

Thanks for sharing your knowledge!


Thursday, October 12, 2017 - 8:02:31 AM - Alin Back To Top (67235)

 Hi John.

Thank you for taking your time and replying.

We have a backup process that performs the backups locally. We have the details of the location and filename of the most recent backups not yet sync'd with S3, and the stored procedure collects all these details and dynamically builds the PS script required to sync them with S3. Once the full script is built, it calls XP_CMDSHELL and runs the whole script (in the form of "powershell.exe -Command <individual_Write-S3Object_commands>"). All errors are then trapped and stored within the database for reporting / alerting purposes. All successfully sync'd backups are marked as such and then purged when the retention threshold is met. All not-sync'd backups are not deleted from local storage until we review them and manually trigger the sync.

The stored procedure is called from a larger job that is used to perform the backups (custom process, not maintenance plans).

SQL is SQL 2008R2 running on Windows 10 (small shop).

Hope this helps by giving more context to the problem.

Regards,

Alin

 

 


Wednesday, October 11, 2017 - 7:54:06 PM - John Grover Back To Top (67220)

Alin,

Is there a specific need to run the command from a stored procedure vs. the SQL Server Agent? I would generally discourage use of xp_cmdshell unless you absolutely need it (your definition of "absolute need" may vary). As an alternative you could try creating an Agent Job and running it using the sp_start_job stored procedure in msdb, documented at https://docs.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sp-start-job-transact-sql

If you've not restarted SQL Server since installing the AWS Tools it may not be able to find them. There may also be issues with the SQL Server service account being able to access the modules due to permissions; either of these would result in the same error if you try to run your xp_cmdshell command directly in a SSMS query window.

If this does not help, knowing your SQL Server version, edition, and Windows OS version would be helpful diagnostic information.


Wednesday, October 11, 2017 - 4:05:19 PM - Alin Back To Top (67214)

Hi John

 

Not sure where to turn to, so here's my question. I was using this approach on an AWS-hosted instance and it is working nicely.

I tried to implement a similar approach on an on-premises instance and I am seeing the following behavior:

1. If I manually run the Write-S3Object cmdlet with all the required parameters, it works and copies whatever I need to, no matter how I run it (inside a batch file, as a manual command inside a PS command window, manually as POWERSHELL.EXE -Command ...).

2. If I run it inside a stored procedure using xp_cmdshell "powershell.exe -Command Write-S3Object ..." it fails with the following error message:

Write-S3Object : The term 'Write-S3Object' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again. At line:1 char:1 + Write-S3Object -BucketName my-bucket-name-here -File H:\Backups\backup_file.bak + ~~~~~~~~~~~~~~ + CategoryInfo : ObjectNotFound: (Write-S3Object:String) [], CommandNotFoundException + FullyQualifiedErrorId : CommandNotFoundException

The PATH variable includes the location of the AWS tools, but lists it last. The AWS tools have been installed using the MSI and, as I already mentioned, if I try it manually, it works.

I am out of ideas and also not very savvy with PS and AWS.

Any pointers or suggestions, very much appreciated.

 


Tuesday, March 28, 2017 - 4:57:51 PM - Jebi Nivens Back To Top (53828)

 While I chose not to use a maintenance plan, I found this solution useful when it came to creating credentials on S3 and using the access keys to get access in S3.  Thank you.  

 


Saturday, March 18, 2017 - 8:09:48 PM - John Grover Back To Top (51400)

 

Peter,

It appears that your SQL Server was shut down or restarted while the backup was still executing: "The job was stopped prior to completion by Shutdown Sequence 0". You can verify this in the SQL Server or Windows logs. If you have CloudWatch configured those logs may also be helpful.

 


Thursday, March 16, 2017 - 4:49:39 PM - Peter Back To Top (51257)

 Hi John.

 My "S3 Archive Full Backup" Job fails with the following error logs:

03/16/2017 16:37:08,S3 Archive Full Backup,Cancelled,,WIN-63LBGVJD5BM\TESTSQLS3,S3 Archive Full Backup,,,The job was stopped prior to completion by Shutdown Sequence 0.  The Job was invoked by User WIN-63LBGVJD5BM\backupuser.  The last step to run was step 1 (Copy full backups to S3).,00:01:54,0,0,,,,0

03/16/2017 16:37:08,S3 Archive Full Backup,Cancelled,1,WIN-63LBGVJD5BM\TESTSQLS3,S3 Archive Full Backup,Copy full backups to S3,,Executed as user: WIN-63LBGVJD5BM\backupuser. The step was cancelled (stopped) as the result of a stop job request.,00:01:54,0,0,,,,0

 

Before this, I got an error as follows:

 

03/16/2017 15:59:05,S3 Archive Full Backup,Success,1,WIN-63LBGVJD5BM\TESTSQLS3,S3 Archive Full Backup,Copy full backups to S3,,Executed as user: WIN-59LUAVJD8AR\backupuser. ================= BEGIN ================  USERNAME ....... backupuser  USERPROFILE .... C:\Users\backupuser.WIN-63LBGVJD5BM  NEW USERPROFILE. C:\Users\backupuser  BACKUPTYPE .... "*.bak"  ================= PS =================== c:\AWS\BackupToS3.ps1 : File C:\AWS\BackupToS3.ps1 cannot be loaded because running scripts is disabled on this system. For more information, see about_Execution_Policies at http://go.microsoft.com/fwlink/?LinkID=135170.  At line:1 char:1  + c:\AWS\BackupToS3.ps1 *.bak  + ~~~~~~~~~~~~~~~~~~~~~      + CategoryInfo          : SecurityError: (:) [], PSSecurityException      + FullyQualifiedErrorId : UnauthorizedAccess  ================= END ==================.  Process Exit Code 0.  The step succeeded.,00:00:01,0,0,,,,0

 

This was resolved after changing the execution policy to unrestricted. Please advise on the first error above.


Wednesday, March 8, 2017 - 9:37:08 PM - John Grover Back To Top (47618)

There's the issue. BackupUser is not SQLSERVERAGENT. Try creating a proxy for BackupUser to execute the Maint Plan.


Wednesday, March 8, 2017 - 6:18:45 PM - Andre Back To Top (47608)

 

Fixed it. You should mention that both scripts should also be owned by and accessible to BackupUser. After I changed the permissions on the folder where the scripts were located, it worked. Thank you. You can delete my earlier comments.


Wednesday, March 8, 2017 - 5:55:39 PM - Andre Back To Top (47607)

 Thank you John.

I saved my credential via PowerShell (not PowerShell for AWS) in the profile named "default".

I logged in with the BackupUser and waited until the maintenance plan was triggered automatically. When I check the log I always find the following:

================= BEGIN ================
USERNAME ....... SQLSERVERAGENT
USERPROFILE .... C:\Users\SQLSERVERAGENT
NEW USERPROFILE. C:\Users\SQLSERVERAGENT
================= PS ===================
Initialize-AWSDefaults : Unable to load stored credentials for profile =
[default],
At C:\AWS\BackupToS3.ps1:9 char:1
+ Initialize-AWSDefaults -ProfileName default -Region eu-west-1
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Initialize-AWSDefaults], Argu
   mentException
    + FullyQualifiedErrorId : System.ArgumentException,Amazon.PowerShell.Commo
   n.InitializeDefaultsCmdlet
 
Write-S3Object : No credentials specified or obtained from persisted/shell
defaults.
At C:\AWS\BackupToS3.ps1:26 char:9
+         Write-S3Object -BucketName X-X -Key $s3keyname -File
$backu ...
+        
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidOperation: (Amazon.PowerShe...eS3ObjectCm
   dlet:WriteS3ObjectCmdlet) [Write-S3Object], InvalidOperationException
    + FullyQualifiedErrorId : InvalidOperationException,Amazon.PowerShell.Cmdl
   ets.S3.WriteS3ObjectCmdlet
 
Write-S3Object : Cannot bind argument to parameter 'File' because it is null.
At C:\AWS\BackupToS3.ps1:26 char:75
+         Write-S3Object -BucketName X-X -Key $s3keyname -File
$backu ...
+                                                                          
~~~~~~
    + CategoryInfo          : InvalidData: (:) [Write-S3Object], ParameterBind
   ingValidationException
    + FullyQualifiedErrorId : ParameterArgumentValidationErrorNullNotAllowed,A
   mazon.PowerShell.Cmdlets.S3.WriteS3ObjectCmdlet
 
================= END ==================

 


Wednesday, March 8, 2017 - 5:20:11 PM - John Grover Back To Top (47604)

Andre-

Make sure you have stored your credentials in a profile using Set-AWSCredentials, then verify they are stored by using Get-AWSCredentials -ListStoredCredentials. The output of Get-AWSCredentials will give you the name(s) of the AWS credential profiles stored for your Windows account. You can also look at the file that contains the profiles in %LOCALAPPDATA%\AWSToolKit\RegisteredAccounts.json

Note that profiles are per Windows account, if you create the profile as "Andre" and try to access them as "SQLServer" it will not work.

The Amazon article at http://docs.aws.amazon.com/powershell/latest/userguide/specifying-your-aws-credentials.html gives very good directions on how to manage your credentials.

 

Wednesday, March 8, 2017 - 3:22:35 PM - Andre Back To Top (47601)

 

Hello, I always get the following error: Initialize-AWSDefaults : Unable to load stored credentials for profile =   [default]

I followed your guide step by step and also deleted the profile a couple of times.

Any idea?

 


Friday, May 20, 2016 - 10:59:49 AM - LS Back To Top (41525)

Great article. I did follow step 1 and step 2, but for the 3rd step I used the command only to copy the .bak file. I did not need to set up the maintenance plan. Can you also include an additional step to restore from S3 to RDS or EC2? That would be helpful! Thanks again.

 


Thursday, December 31, 2015 - 7:48:10 AM - John Grover Back To Top (40338)

Emile,

You are absolutely correct, as described in this tip scale would be a problem. The point of the article is to describe the basic steps needed to copy backups to S3. The target audience is really the "accidental" or small-shop DBA who wants a simple way to get an off-site copy of their backups. It can also be a starting point for an experienced DBA to put together a system that would work in their environment. 

There are several vendors who offer SQL Server to S3 backup solutions; Cloudberry comes to mind.

Thanks for your comment!

John G


Tuesday, December 29, 2015 - 8:12:38 PM - Emile van der Donk Back To Top (40327)

 

 

With all respect, this method looks rather silly to me. It might work for the 500MB tennis club database, but for any serious database environment, the only model that scales is a delta-based backup model where only the diff is transferred to sync the remote backup file with the original. Any other solution transfers gigabytes of the same data every night.

We have approx 300GB of SQL database backup files but the delta file is between 200MB and 600MB a day. Without delta files, we'd be sending over 300GB per day, with a delta in place this is just 

What strikes me is that this is so trivial yet no vendor seems to come up with a proper solution. Or am I missing something?


Tuesday, November 10, 2015 - 1:55:32 PM - Erik Back To Top (39047)

Hi

I made a tool that helps to upload the backups to Amazon S3, Glacier, Azure and Dropbox. It's free.

The url is www.sqlserverboooster.com

Regards,
Erik 


Thursday, March 12, 2015 - 8:39:22 AM - Sufian Back To Top (36505)

This is a file copy, not a database backup to S3.

There are other ways to do the backup directly to S3, such as mapping drives or URL access.


Sunday, January 11, 2015 - 1:17:27 AM - Hubi Vedder Back To Top (35894)

The native support for SQL Server backup to Microsoft Azure makes it a far better proposition than AWS. Probably cheaper too if you have an existing agreement with Microsoft.

http://msdn.microsoft.com/en-us/library/jj919148.aspx

 

Hubi


Monday, December 29, 2014 - 3:57:22 PM - John Grover Back To Top (35783)

Sean-

You certainly don't need to use a Maint Plan to use this mechanism, but it was the easiest way to illustrate the process. As far as the definition of a backup - well, that can be a philosophical argument, but I'd call a copy of the data a backup. The point is to show what is required to get a recoverable copy of your databases into the cloud as part of a 3-2-1 backup scheme.


Monday, December 29, 2014 - 10:24:10 AM - Greg Robidoux Back To Top (35780)

Lots of SQL Server environments still use Maintenance Plans. I agree Maintenance Plans are not the best choice, but it's better than nothing. Also, not all SQL Servers are managed by DBAs, so people use what is available.

-Greg


Monday, December 29, 2014 - 9:16:04 AM - Sean McCown Back To Top (35779)

This is not backing up to S3, it's backing up locally and copying it over.

Also, no DBA with any self respect would use a Maint Plan.

 














