PowerShell AWS Tools for Fast File Copy

By: Douglas Correa   |   Comments (4)   |   Related: > Amazon AWS


Problem

Amazon provides good tools for working with its services, and AWSPowerShell is a convenient way to manage them. The AWS Tools for Windows PowerShell and AWS Tools for PowerShell Core are PowerShell modules built on the functionality exposed by the AWS SDK for .NET. The AWS PowerShell Tools enable you to script operations on your AWS resources from the PowerShell command line.

One issue we face is sending big files from a local disk to an AWS S3 bucket: uploading files through the browser console can be very slow, can consume far more resources on your machine than expected, and can take days to finish. The browser also gives you very little control over the files. How can we address this with PowerShell?

Solution

In this first part, I will focus on how to work with an AWS S3 bucket and how to solve the slowness of uploading files to it. First, download and install the AWS SDK from https://aws.amazon.com/powershell/.

The installation process is quite simple; you can install all options as I did, or install only the AWS Tools for Windows PowerShell.


After installation you will see shortcuts in the Start menu for AWS Tools and Windows PowerShell for AWS.


The first shortcut is a guide to using the installed tools.


Windows PowerShell for AWS has the AWSPowerShell module imported. To see which cmdlets are available, execute:

Get-Command -Module AWSPowershell | Where-Object Name -like *S3*

In this case I want only the S3 cmdlets.


You can also use the PowerShell ISE to work with the available cmdlets. Run the cmdlets below to make the module available:

Install-Module -Name AWSPowerShell
 
Import-Module AWSPowerShell
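
To confirm the module installed correctly and check which version you have, a quick sketch:

```powershell
# List the installed AWSPowerShell module and its version.
Get-Module -Name AWSPowerShell -ListAvailable | Select-Object Name, Version

# The module also ships its own version cmdlet with more detail.
Get-AWSPowerShellVersion
```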

With AWSPowershell we can manage services such as EC2, CloudWatch, IAM, SNS, SQS and so on. Regarding S3 you can create and delete Amazon S3 buckets, upload files to an Amazon S3 bucket as objects, delete objects from an Amazon S3 bucket and much more.
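
As a minimal sketch of those S3 operations (the bucket and key names here are placeholders you would replace with your own):

```powershell
# Create a bucket, list its objects, then delete a single object.
New-S3Bucket -BucketName 'my-example-bucket' -Region us-east-2
Get-S3Object -BucketName 'my-example-bucket'
Remove-S3Object -BucketName 'my-example-bucket' -Key 'oldbackup.bak' -Force
```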

To connect to your AWS account you need the access key and secret key of the user account for the session.

Initialize-AWSDefaultConfiguration -AccessKey AKIAIOSFODNN7EXAMPLE -SecretKey wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY -Region us-west-2

That cmdlet saves the specified credential keys and default region selection to a profile named 'default'. The credentials and region are set as active in the current shell.
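
If you work with more than one account, you can store the keys under named profiles instead of the 'default' profile; a sketch, where the profile name is illustrative:

```powershell
# Save credentials under a named profile, then activate it for this shell.
Set-AWSCredential -AccessKey AKIAIOSFODNN7EXAMPLE -SecretKey wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY -StoreAs myprofile
Initialize-AWSDefaultConfiguration -ProfileName myprofile -Region us-west-2
```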

Let's create variables for the common information and write a simple script that verifies the bucket exists and, for each file that is not already in the bucket, uploads it.

$bucket = 'your bucket name'
$source = 'your local path'
$AKey   = 'AKIAIOSFODNN7EXAMPLE'
$SKey   = 'wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY'
$region = 'us-east-2'

Initialize-AWSDefaultConfiguration -AccessKey $AKey -SecretKey $SKey -Region $region
 
Set-Location $source
$files = Get-ChildItem '*.bak'   ## FileInfo objects; the Name property is used as the S3 key
try {
   if(Test-S3Bucket -BucketName $bucket) {
      foreach($file in $files) {
         if(!(Get-S3Object -BucketName $bucket -Key $file.Name)) { ## verify if the object already exists
            Write-Host "Copying file : $($file.Name)"
            Write-S3Object -BucketName $bucket -File $file.Name -Key $file.Name -CannedACLName private
         } 
      }
   } Else {
      Write-Host "The bucket $bucket does not exist."
   }
} catch {
   Write-Host "Error uploading file $($file.Name): $_"
}

The PowerShell ISE will show the uploading message.


You may see errors if your default configuration is not set properly. For example:


This error occurred because the cmdlet was run with the wrong region value.

Another error could look like this:


In this case the -File parameter passed to the Write-S3Object cmdlet was wrong, but below that error message you can see another one with different information: "The bucket you are attempting to access must be addressed using the specified endpoint...".

The first line shows the "right" error: "The file indicated by the FilePath property does not exist".
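
When you hit the endpoint error, one way to check which region a bucket actually lives in (a hedged sketch, assuming the $bucket variable from above) is:

```powershell
# Returns the bucket's region constraint; an empty value means us-east-1.
Get-S3BucketLocation -BucketName $bucket
```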

Conclusion

This is the first part of using PowerShell to work with AWS services. In my tests, uploading via the browser console took a day for 100 GB. Using AWSPowerShell it took a few hours to upload 14 files totaling over 160 GB on the same network, without using any S3 acceleration features.

Also, you could see I created a simple script to verify whether a file exists before uploading it. With PowerShell you have much more control over how the files are sent: you can change the order, log errors, schedule when to send the files to S3, and, if a file already exists in the bucket, check its modified date to decide whether or not to update it.
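
As a sketch of that last idea (assuming the same $bucket variable and default configuration as above), you could compare the local LastWriteTime with the S3 object's LastModified and re-upload only files that are newer locally:

```powershell
# Re-upload only files that are missing from the bucket or newer locally.
foreach($file in Get-ChildItem '*.bak') {
   $s3obj = Get-S3Object -BucketName $bucket -Key $file.Name
   if(!$s3obj -or $file.LastWriteTimeUtc -gt $s3obj.LastModified.ToUniversalTime()) {
      Write-Host "Uploading newer file: $($file.Name)"
      Write-S3Object -BucketName $bucket -File $file.Name -Key $file.Name -CannedACLName private
   }
}
```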

Next Steps


About the author
MSSQLTips author Douglas Correa Douglas Correa is a database professional, focused on tuning, high-availability and infrastructure.

This author pledges the content of this article is based on professional experience and not AI generated.

View all my tips



Comments For This Article




Tuesday, November 24, 2020 - 6:11:59 PM - Raj Back To Top (87839)
Hello,

while executing the script I got below error. Could please help me to resolve this issue?

Error uploading file @{Name=xxxxxxdb01_XXX_DEVELOPMENT_LOG_20200904_150002.trn}.

I am uploading full backup files as well as log backup files to the same bucket but different folders using the below script.


$bucket = 'your bucket name'
$source = 'your local path'
$exts = "*.bak","*.trn"
$AKey = 'AKIAIOSFODNN7EXAMPLE'
$SKey = 'wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY'
$region = 'us-east-2'

Initialize-AWSDefaultConfiguration -AccessKey $AKey -SecretKey $SKey -Region $region

Set-Location $source
$files = Get-ChildItem $source -Include $exts -recurse -force | Select-Object -Property Name
try {
if(Test-S3Bucket -BucketName $bucket) {
foreach($file in $files) {
if(!(Get-S3Object -BucketName 'your bucket name' -Key $file.Name -like "*.bak")) { ## verify if exist
Write-Host "Copying file : $file "
Write-S3Object -BucketName 'your bucket name/full' -File $file.Name -Key $file.Name -CannedACLName private
else
Write-S3Object -BucketName 'your bucket name/log' -File $file.Name -Key $file.Name -CannedACLName private
}
}
} Else {
Write-Host "The bucket $bucket does not exist."
}
} catch {
Write-Host "Error uploading file $file"
}

Thanks,
Raj K.

Thursday, July 23, 2020 - 4:28:28 AM - vipul Back To Top (86178)

Hi

I tried to upload a file but got the below error.

Invalid URI: The hostname could not be parsed

What is issue?

Thanks


Wednesday, August 29, 2018 - 4:27:24 PM - Natalia Back To Top (77340)

 This is exactly what I needed!! Thank you very much, very very helpful!


Friday, August 10, 2018 - 3:58:33 PM - jeff_yao Back To Top (77124)

Thanks for the tip @Douglas.

With more companies moving some or all of their IT resources to the cloud, tips like this are really welcome these days.

 














