Using Files Stored in Azure File Services with Integration Services - Part 1

Related: Azure Data Factory


I have a comprehensive ETL solution developed with SQL Server Integration Services (SSIS). It can read and write data from files stored on a local network share. I want to migrate this solution to the cloud. I can migrate my packages to the Azure-SSIS IR environment in Azure Data Factory, but what about my on-premises network share?


Azure Data Factory supports a lift-and-shift scenario where existing SSIS projects can easily be migrated to the Azure cloud environment. In Azure, the SSIS packages are executed in the Azure-SSIS Integration Runtime. But there are other dependencies which need to be migrated to the cloud as well. If your SSIS packages read files from a network drive, you can either give the Azure-SSIS IR access using a VNET, or you can migrate your whole setup to Azure File Services or another service where you can store your files, such as Azure Blob Storage. In this tip, we'll discuss Azure File Services.

The main benefit of Azure File Services (AFS) is that you can migrate your existing SSIS packages very easily, because AFS also supports UNC paths and the SMB protocol. For example, suppose an SSIS package reads a flat file from a UNC path on your on-premises network share. In AFS, you'll have a similar UNC path pointing at the storage account, which makes it only a question of changing a parameter value to change the location of the flat file. For more information about the Azure-SSIS IR, you can read the related tips on this site.
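As an illustration (with hypothetical server, storage account and share names), the before-and-after paths could look like this — only the server part of the UNC path changes:

```shell
:: On-premises UNC path used by the SSIS flat file connection (hypothetical names)
dir \\myfileserver\etlshare\input\customers.csv

:: Equivalent path once the file lives in Azure File Services
dir \\mystorageaccount.file.core.windows.net\etlshare\input\customers.csv
```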

Azure File Services and SSIS

Setting up Azure File Services

Log into the Azure Portal. If you don't yet have a storage account, go to Add a resource and search for storage account.

add storage account in Azure File Services

In the next screen, you need to specify the configuration of the storage account, such as the subscription, the resource group, the location and so on.

Create storage account in Azure File Services

Important options are the account kind and the access tier. For the account kind, general-purpose v2 or blob storage are recommended; since we're going to work with files, StorageV2 is the best option. For the access tier, hot storage is useful for frequently accessed data, while cool storage is optimized for infrequent access. Both have different cost mechanisms, so choose the one best suited to your situation. In this tip, I chose cool storage, since the files are going to be used in an ETL process which runs only once a day.
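If you prefer scripting over the portal, the same storage account can be created with the Azure CLI. This is a sketch with hypothetical resource group, account name and location; the flags mirror the choices made above (StorageV2 kind, cool access tier):

```shell
# Create a general-purpose v2 storage account with the cool access tier
# (hypothetical names; pick your own resource group, account name and region)
az storage account create \
    --name mystorageaccount \
    --resource-group my-resource-group \
    --location westeurope \
    --sku Standard_LRS \
    --kind StorageV2 \
    --access-tier Cool
```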

Once the storage account is created, you can go to the File Services section:

file services

The next step is to create a file share by clicking the + icon at the top:

add file share

All you have to specify is a name and the quota, which is the maximum size of the share in GiB. This property can be adjusted later on.

create file share
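The same file share can also be created, and its quota adjusted later, from the Azure CLI. A sketch with hypothetical names:

```shell
# Create a file share with a 10 GiB quota (hypothetical names)
az storage share create \
    --name etlshare \
    --quota 10 \
    --account-name mystorageaccount \
    --account-key <YourStorageAccountKey>

# The quota can be changed later without recreating the share
az storage share update \
    --name etlshare \
    --quota 50 \
    --account-name mystorageaccount \
    --account-key <YourStorageAccountKey>
```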

Click on the newly created share to open it.

open share

In the share, you have many different options:

  • Connect. This will create a mapped drive on your computer.
  • Upload, Add Directory, Refresh and Delete Share are self-explanatory.
  • With the Quota option, you can change the quota we set earlier.
  • You also have the option to create a snapshot of the entire file share with Create Snapshot. With View snapshots, you can list them in the portal. There's no option to specify a name though. Snapshots are taken incrementally and can be used as a back-up mechanism. Keep in mind they use storage as well, which needs to be paid for. You can find more information about snapshots in the article Announcing the general availability of Azure Files share snapshot.
view snapshot
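Snapshots can be scripted as well. A sketch with hypothetical names; since snapshots can't be named, the timestamp returned by the CLI acts as the snapshot's identifier:

```shell
# Take a snapshot of the file share (hypothetical names)
az storage share snapshot \
    --name etlshare \
    --account-name mystorageaccount \
    --account-key <YourStorageAccountKey>

# List the shares including their snapshots
az storage share list \
    --include-snapshots \
    --account-name mystorageaccount \
    --account-key <YourStorageAccountKey>
```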

When using Azure Storage Explorer, you have the same options as in the portal:

azure storage explorer

The Explorer also has the option to show the statistics of a folder, which is the number of subdirectories and the number of bytes it contains:

statistics for file share

If you want to create a snapshot, you need to use the context menu:

create snapshot in Azure Storage Explorer

Connecting to Azure File Services

Azure File Services has been configured, but how can we access it from our own machine? In Windows, we need to add the credentials of the file share into the Credential Manager. First, we need to find the storage account key. In the portal, go to your storage account and select Access Keys from the menu:

get access keys

Here you can find the name of your storage account and two account keys. Having two keys means you can keep using one while you regenerate the other. Copy the name and one of the account keys to the clipboard.

account keys
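Listing and regenerating the keys can also be done from the Azure CLI (hypothetical names; keep in mind that regenerating a key invalidates the old value everywhere it is used):

```shell
# List both account keys (hypothetical names)
az storage account keys list \
    --account-name mystorageaccount \
    --resource-group my-resource-group

# Regenerate the first key while clients keep using the second one
az storage account keys renew \
    --account-name mystorageaccount \
    --resource-group my-resource-group \
    --key key1
```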

Open Credential Manager and go to Windows Credentials. There, choose to add a Windows credential.

add a credential

You need to specify the following information:

  • Network address. This is the name of your storage account, followed by .file.core.windows.net.
  • The user name, which is AZURE\ followed by the name of the storage account.
  • As the password, you need to specify the account key.
specify windows credential

Once the credential is added, you can simply browse to your file share as if it was locally on your network! Keep in mind the root folder is your storage account plus the name of the file share, so browsing to just \\<storageaccountname>.file.core.windows.net will not work.

browse to file share

Another option to add the credential is to use the command line with the following command:

cmdkey /add:mystorageaccount.file.core.windows.net /user:AZURE\mystorageaccount /pass:<YourStorageAccountKey>
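Instead of only storing the credential, you can also mount the share to a drive letter in one step with net use. A sketch with hypothetical names, following the documented Azure Files mounting syntax:

```shell
:: Map drive letter Z: to the Azure file share (hypothetical names)
net use Z: \\mystorageaccount.file.core.windows.net\etlshare /user:AZURE\mystorageaccount <YourStorageAccountKey>
```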

If you create SSIS packages, they can access the file share from within Visual Studio, since they will use the credential you just specified. However, when the SSIS package is stored inside the SSIS catalog in Azure, it needs access to AFS as well when the package is executed. This means we need to add the credential to the SSIS catalog as well. This can be done with the stored procedure catalog.set_execution_credential. When logged into the SSISDB database, execute the following script:

EXEC catalog.set_execution_credential @domain = N'Azure', @user = N'mystorageaccount', @password = N'<YourStorageAccountKey>';

In the next part of the tip, we'll integrate Azure File Services into an SSIS package.


About the author
Koen Verbeeck is a seasoned business intelligence consultant at AE. He has over a decade of experience with the Microsoft Data Platform in numerous industries. He holds several certifications and is a prolific writer, contributing content about SSIS, ADF, SSAS, SSRS, MDS, Power BI, Snowflake and Azure services. He has spoken at PASS, SQLBits and dataMinds Connect, and delivers webinars. Koen has been awarded the Microsoft MVP data platform award for many years.

This author pledges the content of this article is based on professional experience and not AI generated.


Comments For This Article

Tuesday, June 8, 2021 - 1:55:42 AM - Koen Verbeeck Back To Top (88815)
Hi Marlo,

do you mean the Azure-SSIS IR? (running your SSIS packages in ADF?) Because that should work. Make sure you've set the execution credentials in the SSISDB catalog.

Monday, June 7, 2021 - 11:21:16 AM - Marlo Back To Top (88810)
The File Share doesn't work with Azure IR. You get an error. I think this is an only option for on-premise.

Wednesday, April 14, 2021 - 9:16:31 AM - Koen Verbeeck Back To Top (88534)
Hi Rob,

this stored procedure is normally only found in SSIS catalogs that support an Azure-SSIS IR, so either in Azure SQL DB or Azure Managed Instance. Is this the case? If you're talking about your local SSISDB, you need to use windows credential manager to set the credentials.

Tuesday, April 13, 2021 - 6:33:11 PM - Rob Back To Top (88526)
the stored procedure catalog.set_execution_credential is missing from my SSISDB. Do I have to install something, perhaps ADF, to get it?

Wednesday, March 17, 2021 - 7:39:05 AM - Hello Back To Top (88425)
Thanks very much Koen, yes that was the problem and I just missed that last line on your very helpful article. Sorted! thanks again.

Tuesday, March 16, 2021 - 5:37:55 AM - Koen Verbeeck Back To Top (88413)

did you give permissions to the SSIS catalog using the catalog.set_execution_credential stored proc?


Sunday, March 14, 2021 - 10:49:53 PM - Hello Back To Top (88393)
Hi, thanks very much for such an informative and helpful article.

I followed everything I could and even able to see file data via flat file connection via mapped drive. When I execute the package from VS it's all ok but when I execute it via ADF or SSISDB I keep getting this message,

"SsisDBOperationGuid": "acc0264b-2f5e-4c32-b15f-6a3e4ef86bd7",
"SsisDBOperationId": 15,
"OperationErrorMessages": "3/15/2021 2:31:30 AM +00:00 : adf:Error: The file name \"\\\\\\fileshare\\customer.txt\" specified in the connection was not valid.\r\n\n3/15/2021 2:31:33 AM +00:00 : adf:Error: The file name \"\\\\\\fileshare\\customer.txt\" specified in the connection was not valid.\r\n\n3/15/2021 2:31:35 AM +00:00 : adf:Error: The file name \"\\\\\\fileshare\\customer.txt\" specified in the connection was not valid.\r\n\n",
"effectiveIntegrationRuntime": "sqlserver (East US)",
"executionDuration": 21,
"durationInQueue": {
"integrationRuntimeQueue": 1

any thoughts?

Many Thanks

Friday, September 20, 2019 - 1:41:04 PM - Brian Kofford Back To Top (82523)

Where does the catalog.set_execution_credential stored procedure come from? My SSISDB does not include it, even if I delete and recreate the catalog.
