Automate Retrieving SQL Azure Bacpacs With PowerShell


By:   |   Last Updated: 2015-05-20   |   Comments (1)   |   Related Tips: More > Azure

Problem

We would like to store backups of Azure databases locally, both for local testing and for possible disaster recovery. Can we use the SMO library with Azure SQL, and is it possible to automate this with PowerShell?

Solution

Yes, we can use the SMO library with Azure SQL and loop through databases much as we loop through them in SQL Server. The script below loops through all databases on an Azure server, except master, and exports each one's bacpac file to a local path. According to Microsoft, bacpac files contain both the schema and data of the database (though see the limitations and restrictions on the linked page, where Microsoft suggests copying the database first and then exporting the data and schema from the copy). Bacpac files are useful for migrating databases, for instance, copying an Azure SQL database to a local SQL Server environment.

Function Get-Bacpacs {
    Param(
        [string]$location
        , [string]$server
        , [string]$smolibrary
        , [string]$daclibrary
        , [string]$username
        , [string]$password
    )
    Process
    {
        $dt = Get-Date -uFormat "%Y%m%d"

        ###  Load the SMO and DAC libraries once, up front
        Add-Type -Path $smolibrary
        Add-Type -Path $daclibrary

        $scon = "Data Source=$server.database.windows.net;Initial Catalog=master;User ID=$username;Password=$password;"
        $servercon = New-Object Microsoft.SqlServer.Management.Common.ServerConnection
        $servercon.ConnectionString = $scon
        $srv = New-Object Microsoft.SqlServer.Management.SMO.Server($servercon)

        ###  Create the output directory if it doesn't exist
        if (!(Test-Path $location))
        {
            New-Item $location -ItemType Directory | Out-Null
        }

        foreach ($db in $srv.Databases | Where-Object {$_.Name -ne "master"})
        {
            $database = $db.Name
            $bak_scon = "Data Source=$server.database.windows.net;Initial Catalog=$database;Connection Timeout=0;User ID=$username;Password=$password;"

            ###  $location should end with a backslash, e.g. "C:\Bacpacs\"
            $outfile = $location + $database + "_" + $dt + ".bacpac"
            $d_exbac = New-Object Microsoft.SqlServer.Dac.DacServices $bak_scon

            try
            {
                $d_exbac.ExportBacpac($outfile, $database)
            }
            catch
            {
                Write-Warning $_
                ###  Other alerting can go here
            }
        }
    }
}

###  This makes it easier for us who don't have really long screens!  Location may vary.
$smo = "C:\Program Files (x86)\Microsoft SQL Server\110\SDK\Assemblies\Microsoft.SqlServer.Smo.dll"
$dac = "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll"

Get-Bacpacs -location "" -server "" -smolibrary $smo -daclibrary $dac -username "" -password ""

The script uses the standard looping mechanism for every database on the Azure server (excluding master). The DacServices class, from the Dac.dll library, includes a method called ExportBacpac, which we call by passing in the output file path and the database name. If the directory where you're storing the bacpac files doesn't exist, the script will create it. Depending on your environment, you may want to add alerting beyond writing a warning message, such as sending an email or text.

For environments with multiple servers, and for the security of usernames and passwords, I recommend storing the values used in the above script (other than the library paths) in a configuration table. The table would hold at least the server name, the location (if it differs), and the username and password. See this tip for an example of the configuration table approach.
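As a rough sketch of that approach, the configuration table and driver loop below are hypothetical; the server name OURCONFIGSERVER, the database ConfigDB, the table tblBacpacConfig, and all column names are placeholders you would adapt to your environment (and a real deployment should encrypt or otherwise protect the stored credentials):

```powershell
###  Hypothetical configuration table (run once on the config server):
#  CREATE TABLE tblBacpacConfig (
#      ServerName VARCHAR(100),
#      BackupLocation VARCHAR(250),
#      AzureUser VARCHAR(100),
#      AzurePass VARCHAR(100)   -- prefer an encrypted store in practice
#  )

###  Read the rows with SMO and call Get-Bacpacs once per configured server
Add-Type -Path $smo
$cfgsrv = New-Object Microsoft.SqlServer.Management.SMO.Server("OURCONFIGSERVER")
$cfg = $cfgsrv.Databases["ConfigDB"].ExecuteWithResults(
    "SELECT ServerName, BackupLocation, AzureUser, AzurePass FROM tblBacpacConfig")

foreach ($row in $cfg.Tables[0].Rows)
{
    Get-Bacpacs -location $row.BackupLocation -server $row.ServerName `
        -smolibrary $smo -daclibrary $dac `
        -username $row.AzureUser -password $row.AzurePass
}
```

This keeps server names and credentials out of the script body, so adding a new Azure server is an INSERT into the table rather than a code change.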

Finally, if the intent is to test code in SQL Server by bringing an Azure SQL database into a local environment, make sure that any new code written is compatible with Azure SQL (unless the intent is for the local environment to become permanent). Testing in both carries advantages for hybrid environments, or for environments that may migrate at some point; but if the intent is one or the other, testing should be verified in the final environment. For instance, a hedge fund may use Azure SQL initially but decide, after applying new algorithms, to store the data locally. In that case, we would get the bacpac for the local environment without worrying about code compatibility, as long as the intent is not to return to Azure. On the other hand, if the hedge fund wants a local environment online in case of emergency, testing should ensure the code works in both environments.

The below code is a simple import, which imports one bacpac at a time (though a developer could use Get-ChildItem, filter by *.bacpac, and restore multiple files):

Function Import-Bacpac {
    Param(
        [string]$server
        , [string]$daclibrary
        , [string]$bak_file
        , [string]$name
    )
    Process
    {
        Add-Type -Path $daclibrary
        $scon = "Data Source=$server;Initial Catalog=master;Connection Timeout=0;Integrated Security=true;"

        $d_impbac = New-Object Microsoft.SqlServer.Dac.DacServices $scon
        $d_loadbac = [Microsoft.SqlServer.Dac.BacPackage]::Load($bak_file)
        $d_impbac.ImportBacpac($d_loadbac, $name)
    }
}

###  Location may differ
$dac = "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll"

Import-Bacpac -server "" -daclibrary $dac -bak_file "" -name ""
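The Get-ChildItem variation mentioned above might look like the following sketch, which restores every bacpac in a folder using DacServices directly; the folder C:\Bacpacs and the server name OURSQLSERVER are placeholders, and the file base name (e.g. MyDb_20150520) becomes the new database name:

```powershell
###  Sketch: restore every bacpac in a folder, one database per file.
###  Assumes $dac points at Microsoft.SqlServer.Dac.dll and the paths exist.
Add-Type -Path $dac
$scon = "Data Source=OURSQLSERVER;Initial Catalog=master;Connection Timeout=0;Integrated Security=true;"
$dacsvc = New-Object Microsoft.SqlServer.Dac.DacServices $scon

foreach ($file in Get-ChildItem -Path "C:\Bacpacs" -Filter *.bacpac)
{
    $package = [Microsoft.SqlServer.Dac.BacPackage]::Load($file.FullName)
    ###  BaseName is the file name without the .bacpac extension
    $dacsvc.ImportBacpac($package, $file.BaseName)
}
```

Since each import creates a database named after the file, dated bacpac names from the export script will produce dated database names; strip the date suffix first if you want the original names.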

We can also import these bacpacs into another Azure SQL subscription. For instance, if I want to clone my production Azure SQL databases to a development subscription weekly, I can export the bacpacs and import them there (note that the connection string on the import would require the credentials for that Azure SQL subscription).

Next Steps
  • Identify the location of your SMO and DAC libraries.
  • Test a database backup from an Azure database.
  • For multiple databases, test using a configuration table when using the above script.


About the author
Tim Smith works as a DBA and developer and also teaches Automating ETL on Udemy.


Tuesday, July 19, 2016 - 10:33:39 AM - tuczap

Script for exporting bacpac from Azure SQL DB works fine, except it is not exporting bacpacs ;) It created the folder under -location and I added some debug messages so I can see that the script is getting into try - catch block but the bacpacs are just not getting created. No error messages.
