

Using Files Stored in Azure File Services with Integration Services - Part 1


By: Koen Verbeeck   |   Last Updated: 2019-06-07   |   Related Tips: More > Azure

Problem

I have a comprehensive ETL solution developed with SQL Server Integration Services (SSIS). It can read and write data from files stored on a local network share. I want to migrate this solution to the cloud. I can migrate my packages to the Azure-SSIS IR environment in Azure Data Factory, but what about my on-premises network share?

Solution

Azure Data Factory supports a lift-and-shift scenario where existing SSIS projects can easily be migrated to the Azure cloud environment. In Azure, the SSIS packages are executed in the Azure-SSIS Integration Runtime. But there are other dependencies which need to be migrated to the cloud as well. If your SSIS packages read files from a network drive, you can either give the Azure-SSIS IR access using a VNET, or you can migrate your whole setup to Azure File Services or another file storage service, such as Azure Blob Storage. In this tip, we'll discuss Azure File Services.

The main benefit of Azure File Services (AFS) is that you can very easily migrate your existing SSIS packages, because AFS also supports UNC paths and the SMB protocol. For example, an SSIS package reads a flat file located at the following path:

\\myserver\myfolder\myflatfile.txt

In AFS, you'll have a similar path, such as:

\\mystorageaccount.file.core.windows.net\myfileshare\myfolder\myflatfile.txt

which makes changing the location of the flat file only a matter of changing a parameter value. For more information about the Azure-SSIS IR, you can read the related tips on MSSQLTips.
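Since only the server and share portion of the path changes, the mapping from a share-relative file path to its AFS equivalent can be sketched as a small helper. This is a hypothetical illustration; the account and share names below are placeholders, and in a real SSIS project you would typically keep the \\server\share prefix in a project parameter and swap it when moving to the cloud:

```python
def to_afs_path(account: str, share: str, relative_path: str) -> str:
    """Build the Azure File Services UNC path for a file, given the
    storage account, the file share, and the share-relative path."""
    return f"\\\\{account}.file.core.windows.net\\{share}\\{relative_path}"

print(to_afs_path("mystorageaccount", "myfileshare", "myfolder\\myflatfile.txt"))
# → \\mystorageaccount.file.core.windows.net\myfileshare\myfolder\myflatfile.txt
```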

Azure File Services and SSIS

Setting up Azure File Services

Log into the Azure Portal. If you don't yet have a storage account, go to Add a resource and search for storage account.

add storage account in Azure File Services

In the next screen, you need to specify the configuration of the storage account, such as the subscription, resource group, the location etc.

Create storage account in Azure File Services

Important options are the account kind and the access tier. Since we're going to use file shares, general-purpose v2 (StorageV2) is the best choice of account kind. For the access tier, hot storage is useful for frequently accessed data, while cool storage is optimized for infrequent access. They have different cost mechanisms, so choose the one best suited to your situation. In this tip, I choose cool storage, since the files are going to be used in an ETL process which runs only once a day.

Once the storage account is created, you can go to the File Services section:

file services

The next step is to create a file share by clicking the + icon at the top:

add file share

All you have to specify is a name and the quota, which is the maximum size in GiB the share can hold. This property can be adjusted later on.

create file share

Click on the newly created share to open it.

open share

In the share, you have many different options:

  • Connect. This will create a mapped drive on your computer.
  • Upload, Add Directory, Refresh and Delete Share are self-explanatory.
  • With the Quota option, you can change the quota we set earlier.
  • You also have the option to create a snapshot of the entire file share with Create Snapshot, and with View Snapshots you can list them in the portal. There's no option to specify a name though. Snapshots are taken incrementally and can be used as a backup mechanism. Keep in mind they use storage as well, which has to be paid for. You can find more information about snapshots in the article Announcing the general availability of Azure Files share snapshot.
view snapshot

When using Azure Storage Explorer, you have the same options as in the portal:

azure storage explorer

The Explorer also has the option to show the statistics of a folder, which is the number of subdirectories and the number of bytes it contains:

statistics for file share

If you want to create a snapshot, you need to use the context menu:

create snapshot in Azure Storage Explorer

Connecting to Azure File Services

Azure File Services has been configured, but how can we access it from our own machine? In Windows, we need to add the credentials of the file share into the Credential Manager. First, we need to find the storage account key. In the portal, go to your storage account and select Access Keys from the menu:

get access keys

Here you can find the name of your storage account and two account keys. Two keys are available so you can use one while you regenerate the other. Copy the name and one of the keys to the clipboard.

account keys

Open Credential Manager and go to Windows Credentials. There, choose to add a Windows credential.

add a credential

You need to specify the following information:

  • Network address. This is the name of your storage account, followed by .file.core.windows.net.
  • The user name, which is AZURE\ followed by the name of the storage account.
  • As the password, specify the storage account key.
specify windows credential

Once the credential is added, you can simply browse to your file share as if it were on your local network! Keep in mind the root of the path is your storage account plus the name of the file share, so browsing to just \\mssqltips.file.core.windows.net will not work.

browse to file share

Another option to add the credential is to use the command line with the following command:

cmdkey /add:mystorageaccount.file.core.windows.net /user:AZURE\mystorageaccount /pass:<YourStorageAccountKey>
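If you script workstation setup, the same cmdkey call can be composed programmatically. A hypothetical sketch: since cmdkey only exists on Windows, the snippet below only builds the argument list and leaves the actual invocation as a comment:

```python
import subprocess  # only needed if you actually run the command on Windows

def cmdkey_args(account: str, account_key: str) -> list[str]:
    """Compose the cmdkey invocation that stores the AFS credential:
    the target host, the AZURE\\-prefixed user, and the account key."""
    return [
        "cmdkey",
        f"/add:{account}.file.core.windows.net",
        f"/user:AZURE\\{account}",
        f"/pass:{account_key}",
    ]

args = cmdkey_args("mystorageaccount", "<YourStorageAccountKey>")
print(" ".join(args))
# On Windows you would then run: subprocess.run(args, check=True)
```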

If you create SSIS packages, they can access the file share from within Visual Studio, since they use the credential you just specified. However, when the SSIS package is stored in the SSIS catalog in Azure, it also needs access to AFS when the package is executed. This means we need to add the credential to the SSIS catalog as well, which can be done with the stored procedure catalog.set_execution_credential. Connected to the SSISDB database, execute the following script:

EXEC catalog.set_execution_credential @domain = N'Azure', @user = N'mystorageaccount', @password = N'storagekey'
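If you automate the catalog configuration, the statement can be generated as well. This is a hypothetical helper that only formats the T-SQL text; in real automation, pass the key as a parameter through your client library rather than splicing it into the string:

```python
def set_execution_credential_sql(account: str, account_key: str) -> str:
    """Format the catalog.set_execution_credential call for SSISDB.
    Single quotes in the key are doubled, per T-SQL literal rules."""
    key = account_key.replace("'", "''")
    return (
        "EXEC catalog.set_execution_credential "
        f"@domain = N'Azure', @user = N'{account}', @password = N'{key}'"
    )

print(set_execution_credential_sql("mystorageaccount", "<YourStorageAccountKey>"))
```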

In the next part of the tip, we'll integrate Azure File Services into an SSIS package.




About the author
Koen Verbeeck is a BI professional, specializing in the Microsoft BI stack with a particular love for SSIS.





