Send Notifications from an Azure Data Factory Pipeline – Part 2



Problem

When you run a pipeline in Azure Data Factory, you typically want to notify someone whether the load succeeded or failed. Especially when there are errors, you want people to take action. However, there is no built-in send email activity in Azure Data Factory. In part 1 of this tip, we created a Logic App in Azure that sends an email using parameterized input. In this part 2, we will integrate this Logic App into an Azure Data Factory (ADF) pipeline.

Solution

If you haven't already, please read part 1 of this tip to learn how to set up an Azure Logic App that sends an email every time an HTTP request is received. To summarize, the Logic App's HTTP trigger expects a request body described by the following JSON schema:

{
    "properties": {
        "DataFactoryName": {
            "type": "string"
        },
        "EmailTo": {
            "type": "string"
        },
        "ErrorMessage": {
            "type": "string"
        },
        "PipelineName": {
            "type": "string"
        },
        "Subject": {
            "type": "string"
        }
    },
    "type": "object"
}			

Using a request body that follows this schema, we can customize the message, and thus the email, that we send from ADF to the Logic App.

Triggering the Logic App from ADF

Suppose you have already created a pipeline with some activities in ADF.

ADF pipeline

For a short introduction to ADF, please refer to the tip Azure Data Factory Overview. Now we want to send an email when an error occurs in an activity. Add a Web activity to the canvas and connect another activity to this new activity using the arrow. Once the connection has been made, right-click on it and change it to a Failure dependency (the ADF equivalent of a failure precedence constraint in Integration Services).

Add failure constraint to Azure Data Factory Pipeline

This will change the color of the connector to red.

red constraint in ADF
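
Behind the scenes, this failure path is simply a dependency condition on the Web activity in the pipeline's JSON definition. The following is a minimal sketch; the activity names are placeholders, and the Web activity's typeProperties (URL, method, headers and body) are configured in the next steps:

{
    "name": "Send Failure Email",
    "type": "WebActivity",
    "dependsOn": [
        {
            "activity": "Copy data1",
            "dependencyConditions": [
                "Failed"
            ]
        }
    ]
}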

Now we need to go to the Azure Logic App and copy the HTTP POST URL:

HTTP POST URL

Paste this URL into the URL field in the settings pane of the Web activity. The method needs to be set to POST.

url and post method

We also need to add a header that sets the Content-Type to application/json. In the body, we enter the following JSON (matching the structure defined earlier):

{
   "DataFactoryName": "@{pipeline().DataFactory}",
   "PipelineName": "@{pipeline().Pipeline}",
   "Subject": "An error has occurred!",
   "ErrorMessage": "The ADF pipeline has crashed! Please check the logs.",
   "EmailTo": "[email protected]"
}			

We're using system variables to retrieve the name of the data factory and the name of the pipeline. All the other fields in the settings pane can be left as-is.
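
For those who prefer the JSON view, the configured Web activity corresponds roughly to the sketch below. The URL is a placeholder for the HTTP POST URL copied from the Logic App, and the body is shown as an object for readability (in the authoring UI you simply paste the JSON text from above into the Body field):

{
    "name": "Send Failure Email",
    "type": "WebActivity",
    "typeProperties": {
        "url": "<HTTP POST URL copied from the Logic App>",
        "method": "POST",
        "headers": {
            "Content-Type": "application/json"
        },
        "body": {
            "DataFactoryName": "@{pipeline().DataFactory}",
            "PipelineName": "@{pipeline().Pipeline}",
            "Subject": "An error has occurred!",
            "ErrorMessage": "The ADF pipeline has crashed! Please check the logs.",
            "EmailTo": "[email protected]"
        }
    }
}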

configured

At the end of the pipeline, we can add a similar Web activity to send an email notifying users that the pipeline has finished successfully. This time we use the following body:

{
   "DataFactoryName": "@{pipeline().DataFactory}",
   "PipelineName": "@{pipeline().Pipeline}",
   "Subject": "Pipeline finished!",
   "ErrorMessage": "Everything is okey-dokey!",
   "EmailTo": "[email protected]"
}			
Finished pipeline.

Now we can run the pipeline and wait to see if any emails come in:

oopsie

Configuring Multiple Notifications

Suppose we have more than one activity for which we want to send a notification if something goes wrong. The following setup won't work in ADF:

multiple failure constraints

It doesn't work because all those failure constraints (the red arrows) are linked together with AND logic, which means the "Send Failure Email" activity will only be executed if ALL preceding activities fail. That can never happen, since the second activity only starts if the first one has completed successfully. In ADF, we cannot configure those constraints to be linked by an OR constraint (like we can in an Integration Services package), which would send the email as soon as one of the activities fails. You can find an example of the desired behavior in the SSIS tutorial Execute SQL Task in SQL Server Integration Services SSIS.
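
The AND behavior becomes clear when you look at how such a setup would be represented in the pipeline JSON: every entry in an activity's dependsOn array has to be satisfied before the activity runs. A sketch with placeholder activity names:

{
    "name": "Send Failure Email",
    "type": "WebActivity",
    "dependsOn": [
        {
            "activity": "Stored Procedure1",
            "dependencyConditions": [ "Failed" ]
        },
        {
            "activity": "Copy data1",
            "dependencyConditions": [ "Failed" ]
        }
    ]
}

Both conditions would have to be true at the same time, which is impossible when the activities run one after the other.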

So how can we solve this in ADF? It's considered a best practice to put re-usable logic in its own pipeline. This means we'll create a single pipeline containing only the Web activity. We'll then call this pipeline from the main pipeline every time we want to send a notification, passing along the parameters needed to construct the JSON body. Let's start by creating the new pipeline:

add new pipeline

Let's copy and paste the Web activity from the main pipeline into this new pipeline.

add web activity to notification pipeline

Notice that the activity is now "neutral": it doesn't know whether it will send a failure or a success email. We have to adapt the body so the activity can use parameters, which we'll define in a moment:

{
   "DataFactoryName": "@{pipeline().DataFactory}",
   "PipelineName": "@{pipeline().Pipeline}",
   "Subject": "@{pipeline().parameters.Subject}",
   "ErrorMessage": "@{pipeline().parameters.ErrorMessage}",
   "EmailTo": "@pipeline().parameters.EmailTo"
}			

We're going to use three parameters: Subject, ErrorMessage and EmailTo. We can reference these parameters in the body using the expression @pipeline().parameters.parametername, wrapped in the string interpolation syntax @{...}.

Click on the pipeline canvas and then go to the Parameters pane of the pipeline. Here we can add our three parameters:

configure pipeline params
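
In the notification pipeline's JSON, the parameters section would look roughly like the sketch below. The default values are placeholders of my own choosing; the reasoning behind picking deliberately unusable defaults follows right after:

"parameters": {
    "Subject": {
        "type": "string",
        "defaultValue": "NoSubjectProvided"
    },
    "ErrorMessage": {
        "type": "string",
        "defaultValue": "NoErrorMessageProvided"
    },
    "EmailTo": {
        "type": "string",
        "defaultValue": "NoEmailProvided"
    }
}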

As default values, I'm specifying strings that are obviously not useful values. This way, if something goes wrong with the parameter mapping, it's easy to see whether the actual parameter value was used or the default. The notification pipeline is now finished. The next step is to modify the main pipeline: all the Web activities will be replaced with an Execute Pipeline activity.

full pipeline

In each Execute Pipeline activity, we define the three parameter values that we're going to pass to the notification pipeline. In the Settings pane, choose the correct pipeline and click New to add the parameters (you have to expand the Advanced section to see this):

configure settings

Here you can add the three parameters for the notification pipeline:

add parameters
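
Put together, an Execute Pipeline activity configured this way corresponds roughly to the sketch below. The activity name, the failed activity it depends on and the notification pipeline's name (SendEmailNotification) are placeholders, not the exact names from the screenshots:

{
    "name": "Send Failure Email",
    "type": "ExecutePipeline",
    "dependsOn": [
        {
            "activity": "Stored Procedure1",
            "dependencyConditions": [ "Failed" ]
        }
    ],
    "typeProperties": {
        "pipeline": {
            "referenceName": "SendEmailNotification",
            "type": "PipelineReference"
        },
        "waitOnCompletion": true,
        "parameters": {
            "Subject": "An error has occurred!",
            "ErrorMessage": "The ADF pipeline has crashed! Please check the logs.",
            "EmailTo": "[email protected]"
        }
    }
}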

Notice that you can get a more useful ErrorMessage if you use dynamic content. For example, for the notification of the first stored procedure activity, we can do the following:

dynamic content

This will retrieve the actual error message from the activity, instead of using a hardcoded message. According to the documentation, this shouldn't be possible, but it still works. If we run the pipeline and the stored procedure activity fails, we get an email with the error message:

email with custom message

The format of the dynamic content can differ per activity (and some activities don't let you access the error details), so please refer to the documentation for more info.
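
As an illustration, the parameter values for such a failure notification could look like the sketch below, where 'Stored Procedure1' is a placeholder for the activity whose error you want to report:

"parameters": {
    "Subject": "An error has occurred!",
    "ErrorMessage": "@{activity('Stored Procedure1').Error.Message}",
    "EmailTo": "[email protected]"
}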

Conclusion

In this two-part tip, we saw how we can leverage Azure Logic Apps to send email notifications in Azure Data Factory. Using parameters and dynamic content, it's possible to pass along actual error messages for certain activities.




About the author
Koen Verbeeck is a seasoned business intelligence consultant at AE. He has over a decade of experience with the Microsoft Data Platform in numerous industries. He holds several certifications and is a prolific writer contributing content about SSIS, ADF, SSAS, SSRS, MDS, Power BI, Snowflake and Azure services. He has spoken at PASS, SQLBits, dataMinds Connect and delivers webinars on MSSQLTips.com. Koen has been awarded the Microsoft MVP data platform award for many years.

This author pledges the content of this article is based on professional experience and not AI generated.




Comments For This Article




Wednesday, October 7, 2020 - 1:23:42 AM - Koen Verbeeck
Hi Bhargav,
I just tested it and my notification still sends the error message from the sp. So it's an undocumented feature :)
This is the expression I used:
@{activity('SQL Log Start').Error.Message}

Wednesday, October 7, 2020 - 1:07:27 AM - Koen Verbeeck
Hi Bhargav,
Yes, you can't catch the error message directly from the sp, unfortunately. It's documented here: https://docs.microsoft.com/en-us/azure/data-factory/transform-data-using-stored-procedure.
It used to be that you could catch it, but for some reason they changed it.

Koen

Tuesday, October 6, 2020 - 2:24:07 PM - Bhargav Kandala
Also, instead of a generic error message, if I type in @{activity('CLEAR TABLES').Error.Message) the job fails with the below error message:
{"code":"BadRequest","message":"ErrorCode=InvalidTemplate, ErrorMessage=The expression '@{activity('CLEAR TABLES').Error.Message)' is not valid: the string interpolation segment starting at position '0' is not terminated. Make sure that every '@{' is followed by its '}'.\"","target":"pipeline/Load Mayor_Covid19 ADL to SQLMI/runid/81133fdb-f724-459d-8493-9213e06ea852","details":null,"error":null}

Tuesday, October 6, 2020 - 2:04:42 PM - Bhargav Kandala
Thanks Koen,
I figured that out later regarding the web activity. My issue currently is that I have a "TRUNCATE TABLE" stored procedure and 5 COPY activities. I added a send email on error to the stored procedure output; when it fails, it doesn't give me the error message encountered by the stored procedure, so I have to hardcode it.

Friday, September 25, 2020 - 1:51:35 AM - Koen Verbeeck
Hi Bhargav,

The paragraph starts with "Suppose you have already created a pipeline with some activities in ADF." Those web activities are just part of that sample pipeline and are irrelevant for sending the email. They could've been any other ADF activity.
If sending an email doesn't work, do you get any error in ADF? Can you send an email when you run the Azure Logic App separately?

Regards,
Koen

Thursday, September 24, 2020 - 4:46:38 PM - Bhargav Kandala
Why do you have a web activity named Start IR with no mention of it in the document? I have a sproc that truncates all tables before they get loaded using the COPY process. I edited the sproc to use a table name that doesn't exist, and when I ran the pipeline it failed, but no email was sent. Please advise.

Thursday, September 24, 2020 - 3:15:21 PM - Bhargav Kandala
Thanks for the article. You have 2 web activities, but you did not mention why, or what the settings are for the first one. I wish I could upload a picture of my pipeline. I did everything, except I do not have a clue as to why you had 2 web activities, and when I executed my pipeline it failed but I did not get an email.

Thursday, April 16, 2020 - 11:24:47 AM - Angéla

Thank you, this helped a lot on my project.


Tuesday, October 8, 2019 - 1:20:19 PM - paul

Hi Koen,

I am in the middle of building and testing a solution in ADF right now, so I am familiar with all the concepts you have used in this article. I would have used the same idea as this article to log errors, but found the Error json object was no longer available inside the executing pipeline.

I tried different permutations and was always told the object is not available.

Also posted a reply to this thread

https://social.msdn.microsoft.com/Forums/en-US/2d2b3736-4a76-4177-88c9-fef7e863b049/parse-the-error-json-message-on-azure-data-factory-and-pass-it-to-another-activity-as-input?forum=AzureDataFactory

And documented all the sources that indicate this is no longer available as of the latest version of ADF.

Paul


Friday, October 4, 2019 - 10:16:52 AM - Koen Verbeeck

Hi Paul,

Thanks for letting me know.
According to the official docs, it should never have worked, but it certainly did at some point in time. I submitted an issue to the documentation saying it was indeed possible and they were going to adapt the docs, but apparently they chose the Azure Monitor route.

Do you know if it actually stopped working, or they did just change the docs?

Regards,
Koen


Thursday, October 3, 2019 - 4:45:25 PM - paul

Unfortunately, this no longer works.

According to this post: https://docs.microsoft.com/en-us/azure/data-factory/transform-data-using-stored-procedure

and the closed issue discussion: https://github.com/MicrosoftDocs/azure-docs/issues/18481

it is by design that they do not allow retrieval of the error message from inside the pipeline. All errors are funneled to the Azure Monitor service.


Wednesday, September 4, 2019 - 6:57:46 AM - Koen Verbeeck

Hi Prathik,

I'm not sure. You can try to monitor the ADF pipeline yourself using .NET:

https://docs.microsoft.com/en-us/azure/data-factory/monitor-programmatically


Tuesday, September 3, 2019 - 10:04:50 AM - Prathik

How to attach a log file from ADLS or BLOB in the mail? 














