How to import data from Azure Blob Storage to a local file with SQL Server Integration Services



Problem

I have a csv file in Azure Blob Storage and I want to import it to my local SQL Server. Is that possible? What tools do I need to import the data?

Solution

Yes. In this tip we will use the SQL Server Integration Services (SSIS) component named Azure Blob Source, which is installed with the SSIS Feature Pack for Azure, to connect to Azure Blob Storage.

We are assuming that you already have an Azure Storage Account and a Blob container. If you do not, please refer to this tip (Automatically upload SQL Server Database Backups to Azure with Integration Services) and follow steps 1 through 6. In that tip, our Azure Storage Account was mssqltips and the Blob container was bc1.

Requirements

  1. SQL Server 2016.
  2. SQL Server Data Tools (SSDT) installed.
  3. An Azure subscription.
  4. Internet access.
  5. Microsoft Azure Storage Explorer (MASE) installed.
  6. The SSIS Feature Pack for Azure installed.
  7. An Azure Storage Account and a Blob container created (see our tip Automatically upload SQL Server Database Backups to Azure with Integration Services).

Getting Started

  1. Create a csv file named file.csv with the following content:

     Name;Lastname;email
     John ;Lennon;[email protected]
     Ringo ;Starr;[email protected]
     Paul;Maccartney;[email protected]
     George;Harrison;[email protected]
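If you prefer to generate file.csv programmatically instead of typing it, the sketch below uses Python's standard csv module with a ";" delimiter. This is a convenience, not something the tip requires; the email values are redacted in the article, so the literal placeholder is kept:

```python
import csv
import io

# Sample rows from step 1. The email column is redacted in the article,
# so the literal "[email protected]" placeholder is kept here.
rows = [
    ["Name", "Lastname", "email"],
    ["John", "Lennon", "[email protected]"],
    ["Ringo", "Starr", "[email protected]"],
    ["Paul", "Maccartney", "[email protected]"],
    ["George", "Harrison", "[email protected]"],
]

# Build the semicolon-delimited content in memory; swap io.StringIO for
# open("file.csv", "w", newline="") to write the file to disk instead.
buf = io.StringIO()
csv.writer(buf, delimiter=";", lineterminator="\n").writerows(rows)
csv_text = buf.getvalue()
```

The resulting csv_text matches the layout shown above, one semicolon-separated record per line.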

  2. Using MASE, upload the file.csv of step 1 to the Azure Blob container bc1 created in the earlier tip:

     Microsoft Azure Storage Explorer

  3. Using SSDT, create an SSIS project. Drag and drop a Data Flow Task onto the design pane:

     Drag a Data Flow Task in the SQL Server Data Tools

  4. Double click the Data Flow Task. Drag and drop an Azure Blob Source. This is a new component included in the SSIS Feature Pack for Azure installed as part of the prerequisites. It will be used to connect to the csv file in Azure:

     Drag an Azure Blob Source into the Data Flow

  5. Drag and drop a Flat File Destination and join it to the Azure Blob Source:

     Drag a Flat File Destination into the Data Flow

  6. Double click the Azure Blob Source and press New to create a new Azure storage connection:

     Azure Blob Source Editor

  7. To connect, you need a Storage account name and an Account key:

     Azure Storage Connection Manager Editor

  8. In your browser, go to the Azure Portal and then to All resources > mssqltips (or your Storage Account, if you used another name) > Access Keys:

     In the Azure Portal, copy your Access Keys

  9. Copy the Storage account name and the key1 value:

     Copy the Storage Account Name and the Key1 value

  10. Paste the Storage account name and the key1 value from step 9 into the Azure Storage Connection Manager Editor from step 7 in SSIS, and test the connection:

     Paste the Storage account name and the key1 in the Azure Storage Connection Manager Editor
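For reference, the two values the Azure Storage Connection Manager asks for (Storage account name and key1) are the same values programmatic clients combine into a standard storage connection string. A minimal sketch, assuming the account name mssqltips from the earlier tip and a placeholder key:

```python
# Values from steps 9-10; the key is a placeholder, not a real key1.
account_name = "mssqltips"
account_key = "<paste key1 here>"

# Standard Azure Storage connection string format built from those two values.
connection_string = (
    "DefaultEndpointsProtocol=https;"
    f"AccountName={account_name};"
    f"AccountKey={account_key};"
    "EndpointSuffix=core.windows.net"
)

# Blobs in the account are addressable at a predictable URL:
# https://<account>.blob.core.windows.net/<container>/<blob>
blob_url = f"https://{account_name}.blob.core.windows.net/bc1/file.csv"
```

Treat the account key like a password: it grants full access to the storage account, which is why SSIS offers to save it with the connection manager rather than in package text.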

  11. The Blob container name is the name of the Blob container created in Azure using MASE; that container is where the csv file is located. The Blob name is the name of the file. We will use ";" as the column delimiter because this csv file is separated by semicolons, as you can see in step 1:

     In the Azure Blob Source Editor enter the Connection Manager, Container Name and Blob Name

  12. Click Columns to check that the columns were recognized:

     Validate the columns are recognized in the Azure Blob Source Editor

  13. Save the properties and go to the Flat File Destination task:

     Flat File Destination Editor

  14. Press New to create a new connection:

     Setup a new connection in the Flat File Destination Editor

  15. Select the Delimited flat file format:

     Specify the Flat File Format

  16. Press Browse to specify the destination file name and path:

     Browse for the file name in the Flat File Destination Editor

  17. Specify the path and name of the file:

     Specify the path and file name

  18. In this example, the destination file will be exported.txt:

     Specify the Properties for the Flat File Destination Editor

  19. Press Mappings to map source columns to destination columns:

     Click on the Mappings option in the Flat File Destination Editor

  20. By default, the destination columns will be the same as the source columns, but you can change them here if you want:

     Map the columns in the Flat File Destination Editor

  21. Start the package to copy the data from the csv file in Azure:

     Start the SSIS Package in SSDT

  22. If everything is OK, a new file will be created in the path specified in step 18:

     Review the directory where the flat file was created

  23. Open the txt file created:

     Review the data for the flat file
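The data flow built above — read a semicolon-delimited blob, write a delimited flat file — can also be sketched outside SSIS. This is an illustration, not the tip's method: the download itself is left to whatever client you use (for example the azure-storage-blob SDK, or an HTTP GET with a SAS token), and only the csv-to-flat-file conversion is shown:

```python
import csv
import io

def blob_csv_to_flat_file(raw: bytes, out_delimiter: str = ",") -> str:
    """Re-delimit semicolon-separated blob content, mirroring what the
    Azure Blob Source -> Flat File Destination pair does in the package."""
    reader = csv.reader(io.StringIO(raw.decode("utf-8")), delimiter=";")
    out = io.StringIO()
    csv.writer(out, delimiter=out_delimiter, lineterminator="\n").writerows(reader)
    return out.getvalue()

# `raw` would normally come from the blob download (e.g. azure-storage-blob's
# BlobClient.download_blob().readall()); a literal sample stands in here.
raw = b"Name;Lastname;email\nJohn;Lennon;[email protected]\n"
flat = blob_csv_to_flat_file(raw)
# Write `flat` where step 18 points, e.g. open("exported.txt", "w").write(flat)
```

The SSIS package remains the better choice when you need logging, restartability, and scheduling through SQL Server Agent; a script like this is handy for quick ad hoc checks.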

Conclusion

In this tip, we copied the data from a csv file stored in an Azure Storage Account to a local txt file using SSIS. For this, an Azure Storage Account and a Blob container were needed. We used the SSIS Feature Pack for Azure, which includes new tasks and components for working with Azure.

Next Steps

For more information, refer to the following links:






About the author
Daniel Calbimonte is a Microsoft SQL Server MVP, Microsoft Certified Trainer and 6-time Microsoft Certified IT Professional. Daniel started his career in 2001 and has worked with SQL Server 6.0 through 2022. Daniel is a DBA and specializes in Business Intelligence (SSIS, SSAS, SSRS) technologies.

This author pledges the content of this article is based on professional experience and not AI generated.




Comments For This Article




Wednesday, September 16, 2020 - 10:00:01 AM - Meenakshi Back To Top (86492)
Hi ,
I have configured same as mentioned but i am getting "[Azure Blob Source] Error: The remote server returned an error: (400) Bad Request." while executing the steps. Kindly let me know if any steps is missing.

Friday, April 24, 2020 - 5:40:40 AM - Adithi Back To Top (85463)

Is there a way to dynamically pass the blobname too? In my case i do not want it to be file.csv but file_country.csv where country is passed dynamically.


Thursday, January 31, 2019 - 12:58:52 PM - Krutika Back To Top (78928)

Inside container i have folders and folders have multiple CSV files
How can loop through that and copy


Tuesday, October 10, 2017 - 1:46:43 PM - Vivekkumar Aruchamy Back To Top (67164)

Can we do the same programmatically in c#?  

 

Do you have any sample for this?


Monday, October 2, 2017 - 8:59:02 PM - Rohan Back To Top (66816)

 How to get a list for file names (e.g. pdf) that are stored in blob on text file or sql ?

 














