Create an Azure Function to execute SQL on a Snowflake Database - Part 2

By: Koen Verbeeck   |   Comments (4)   |   Related: > Azure Functions


Problem

Azure Data Factory currently doesn't have an integrated connector for the Snowflake cloud data warehouse. A work-around is to use Azure Functions to send SQL statements to Snowflake. In this two-part tip, we explain how you can create and use such an Azure Function.

Solution

In part 1 of the tip we explained the basic concepts and created an Azure Function in Visual Studio. This function accepts a SQL statement through the body of an HTTP request and then sends this SQL statement to the Snowflake database. The connection string is retrieved from a configuration file. It's advised to go through part 1 if you want to follow along with this tip.

For completeness, here's the code for the Azure Function:

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using Microsoft.Extensions.Configuration;
using System.Collections.Generic;
 
namespace RunSnowflakeQuery
{
    public static class SnowflakeExecuteQuery
    {
        [FunctionName("SnowflakeExecuteQuery")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function,"get", "post", Route = null)] HttpRequest req,
            ILogger log,
            ExecutionContext context)
        {
            // Build the configuration: local.settings.json is used for local development,
            // environment variables (the application settings) when running in Azure.
            var config = new ConfigurationBuilder()
                 .SetBasePath(context.FunctionAppDirectory)
                 .AddJsonFile("local.settings.json", optional: true, reloadOnChange: true)
                 .AddEnvironmentVariables()
                 .Build();
            string myConnectionString = config["SQLConnectionString"];
            var client = new SnowflakeClient(myConnectionString);

            // Read the SQL statement from the "Query" property of the JSON request body.
            string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
            dynamic command = JsonConvert.DeserializeObject(requestBody);
            var statements = new List<string>();
            statements.Add(command.Query.ToString());
            int affectedRows;
 
            try
            {
                affectedRows = client.ExecuteNonQuery(statements.ToArray());
            }
            catch (Exception e)
            {
                log.LogError(e, "Error processing request");
                // return req.CreateResponse(HttpStatusCode.InternalServerError, e.Message);
                return new BadRequestObjectResult(e.Message);
            }
 
            // return req.CreateResponse(HttpStatusCode.OK);
            return new OkObjectResult(new { rowcount = affectedRows });
        }
    }
}			

The code uses an extra class called SnowflakeClient, which can also be found on the GitHub page of Jess Pani.
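That class isn't reproduced here, but based on how it is used in the function (a constructor that takes the connection string and an ExecuteNonQuery method that accepts an array of statements and returns the number of affected rows), a minimal sketch built on the Snowflake.Data .NET connector could look like the code below. The actual implementation on the GitHub page may differ, so treat this purely as an illustration of the interface the function expects:

using Snowflake.Data.Client;

namespace RunSnowflakeQuery
{
    // Minimal sketch of the SnowflakeClient helper, assuming the Snowflake.Data
    // NuGet package. The real class on the GitHub page may be implemented differently.
    public class SnowflakeClient
    {
        private readonly string _connectionString;

        public SnowflakeClient(string connectionString)
        {
            _connectionString = connectionString;
        }

        // Executes each statement against Snowflake and returns the total number of affected rows.
        public int ExecuteNonQuery(string[] statements)
        {
            int affectedRows = 0;
            using (var connection = new SnowflakeDbConnection())
            {
                connection.ConnectionString = _connectionString;
                connection.Open();
                foreach (var statement in statements)
                {
                    using (var command = connection.CreateCommand())
                    {
                        command.CommandText = statement;
                        affectedRows += command.ExecuteNonQuery();
                    }
                }
            }
            return affectedRows;
        }
    }
}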

Publishing the Azure Function

Now that our function is complete, we want to push it to Azure, where we can easily test it. Publishing is quite straightforward: right-click the project and select Publish…

publish azure function to azure

The next screen asks us for the intended target.

  • Azure Functions Consumption Plan or Premium Plan. This is serverless compute, which scales dynamically and runs the code on demand. The Premium plan adds pre-warmed instances (avoiding cold starts) and more powerful compute, at a higher cost. You can find more info in the pricing guide.
  • There is also the Azure App Service Plan, which is dedicated compute for running your code. More info in the pricing guide.
  • Or you can deploy to a folder.
publishing target

In this example, the defaults were chosen as indicated in the screenshot above. Selecting Run from package file will run the Function from a .zip package containing the application files. More information can be found in the documentation and in this announcement.

In the next screen, you create your Azure App Service (for the difference between the service and the plan, check this blog post). As mentioned in part 1, the App Service gets the same name as the Visual Studio project.

create app service

After creating the App Service, we arrive at the Publish screen.

publish azure function using zip deploy

Before you hit Publish, click Edit Azure App Service settings. This allows you to add the Snowflake connection string to the application settings.

edit application settings
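For reference, the key the function looks up is SQLConnectionString. Below is a hypothetical example of how it could appear in local.settings.json for local development; the account, user and other values are placeholders, and the exact connection string format depends on the connector used inside the SnowflakeClient class (the format shown is the key=value style the Snowflake.Data connector expects). The key is placed at the root of the file so that config["SQLConnectionString"] resolves it directly; in the dialog above, and thus in the deployed Function App, the same name/value pair is entered as an application setting.

{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet"
    },
    "SQLConnectionString": "account=xy12345;user=AZURE_FUNCTION;password=<placeholder>;db=TEST;schema=DBO;warehouse=COMPUTE_WH"
}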

After clicking Publish, we can verify in the output window that publishing went smoothly:

output window

In the Azure Portal, we can now find the newly created App Service (called RunSnowflakeQuery) and its only function, SnowflakeExecuteQuery. In the App Service, you can go to the Function app settings.

app service

In the settings, we can manage the application settings and verify that our connection string was deployed correctly.

manage application settings

As you can see in the screenshot below, the connection string is present. The value is hidden by default, but you can reveal it in the browser. It will be passed to the function as an environment variable at runtime.

connection string environment variable
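This is also why the function builds its configuration with AddEnvironmentVariables(): in Azure, every application setting is exposed to the function's process as an environment variable. If a setting ever seems to be missing, a hypothetical troubleshooting line like the one below can be dropped into the Run method to check whether the value arrived (without logging the secret itself):

// Hypothetical troubleshooting line inside the Run method: application settings
// surface as environment variables in Azure, so this checks the same value that
// config["SQLConnectionString"] returns. Avoid logging the secret itself.
log.LogInformation("SQLConnectionString present: {found}",
    !string.IsNullOrEmpty(Environment.GetEnvironmentVariable("SQLConnectionString")));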

Testing the Azure Function

After all this work, it would be nice to know if the function actually works. It's possible to debug the function in Visual Studio, but since we're using a POST request, it's a bit easier in the Azure Portal.

First of all, we're going to create a dummy table in Snowflake and populate it with a single row:

create sample data in snowflake
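The screenshot above shows this step in the Snowflake web UI. For reference, statements along the following lines set up the dummy table; the column definition and the inserted value are assumptions, since the tip only shows that the table is called DBO.FunctionTest and has a column named myColumn:

-- Assumed definition: the tip only shows the table and column names.
CREATE OR REPLACE TABLE DBO.FunctionTest
(
    myColumn VARCHAR(100)
);

-- Populate it with a single (placeholder) row.
INSERT INTO DBO.FunctionTest(myColumn)
VALUES ('This is a test row');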

Using our function, we're going to send a TRUNCATE TABLE statement to empty the table. In the Azure Portal, go to the App Service and select the function SnowflakeExecuteQuery. On the right-hand side of the screen, you can find the Test functionality.

function in Azure Portal

In the Test pane, we select POST as the HTTP method.

Enter test parameters

In the Request body, we can enter the following JSON:

{
    "Query":"TRUNCATE TABLE DBO.FunctionTest;"
}			

At the bottom of the pane, click Run to start the test. If everything went well, you'll see an output body containing the row count, which is zero since TRUNCATE TABLE doesn't report affected rows. You'll also see the status message "200 OK".

test truncate successful

In Snowflake, we can verify the table is now empty.

empty table

Let's try with a different statement.

{"Query":"INSERT INTO DBO.FunctionTest(myColumn) 
VALUES('Hello MSSQLTips!');"}			

And the result:

insert new record

Let's verify in Snowflake:

test successful, row inserted
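The Test pane is convenient, but the function can of course be called from any HTTP client, which is exactly what Azure Data Factory will do in the follow-up tip. Below is a minimal console sketch; the function URL and the function key (passed in the code query string parameter, which is required because the trigger uses AuthorizationLevel.Function) are placeholders you can copy from the Get function URL option in the portal.

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class TestSnowflakeFunction
{
    static async Task Main()
    {
        // Placeholder URL and function key: copy the real values from
        // "Get function URL" in the Azure Portal.
        var url = "https://runsnowflakequery.azurewebsites.net/api/SnowflakeExecuteQuery?code=<function key>";

        // Same JSON body as used in the portal's Test pane.
        var body = "{\"Query\":\"TRUNCATE TABLE DBO.FunctionTest;\"}";

        using (var client = new HttpClient())
        {
            var content = new StringContent(body, Encoding.UTF8, "application/json");
            var response = await client.PostAsync(url, content);
            Console.WriteLine($"{(int)response.StatusCode} {response.StatusCode}");
            Console.WriteLine(await response.Content.ReadAsStringAsync());
        }
    }
}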

We've demonstrated how we can successfully send SQL statements to the Snowflake cloud data warehouse using the HTTP trigger of an Azure Function. In the next tip, we'll integrate this solution into Azure Data Factory so we can use it as an orchestrator for the ETL process loading the data warehouse.

Next Steps


  • Read Part 1 of this tip: https://www.mssqltips.com/sqlservertip/6202/create-an-azure-function-to-connect-to-a-snowflake-database--part-1/
  • Continue with the follow-up tip, which integrates this Azure Function into an Azure Data Factory pipeline: https://www.mssqltips.com/sqlservertip/6208/integrate-azure-function-into-azure-data-factory-pipeline/



About the author
Koen Verbeeck is a seasoned business intelligence consultant at AE. He has over a decade of experience with the Microsoft Data Platform in numerous industries. He holds several certifications and is a prolific writer contributing content about SSIS, ADF, SSAS, SSRS, MDS, Power BI, Snowflake and Azure services. He has spoken at PASS, SQLBits, dataMinds Connect and delivers webinars on MSSQLTips.com. Koen has been awarded the Microsoft MVP data platform award for many years.

This author pledges the content of this article is based on professional experience and not AI generated.

View all my tips



Comments For This Article




Wednesday, May 19, 2021 - 1:40:05 PM - Koen Verbeeck Back To Top (88714)
Here you go:

https://www.mssqltips.com/sqlservertip/6208/integrate-azure-function-into-azure-data-factory-pipeline/

Tuesday, May 18, 2021 - 9:25:36 PM - Nanda Kandukuri Back To Top (88702)
Hi Koen, I am not able to find the tip for how you integrate this solution into Azure Data Factory so we can use it as an orchestrator for the ETL process loading the data warehouse. I see Part1 and Part2 tips but not the last one. Could you please help.

https://www.mssqltips.com/sqlservertip/6202/create-an-azure-function-to-connect-to-a-snowflake-database--part-1/
https://www.mssqltips.com/sqlservertip/6216/create-an-azure-function-to-execute-sql-on-a-snowflake-database--part-2/
??

Thanks,
Nanda

Friday, July 31, 2020 - 10:10:26 AM - Koen Verbeeck Back To Top (86227)

Hi Rikin,

when I execute a SQL statement from ADF on Snowflake, ADF waits till the statement has finished.

How are you triggering the stored proc? If you run the same statement in Snowflake directly, does it run synchronously or asynchronously?

Koen


Thursday, July 30, 2020 - 4:38:47 PM - Rikin Back To Top (86218)

Hi, I am working on running a stored procedure in Azure Data Factory using an Azure Function. We need to run the SP in a loop, but the Azure Function doesn't wait for the first SP run to complete. Any help would be nice.

Thanks

Rikin














