Azure Data Factory Pipeline Email Notification – Part 1



I've created a data pipeline in Azure Data Factory. I would like to send an e-mail notification if one of the activities fails, or when the end of the pipeline has been successfully reached. However, it seems there's no "send e-mail activity" in Azure Data Factory. How can I solve this issue?


When building ETL pipelines, you typically want to notify someone when something goes wrong (or when everything has finished successfully). Usually this is done by sending an e-mail to the support team or someone else who is responsible for the ETL. In SQL Server Agent, this functionality comes out-of-the-box. See for example the tip How to setup SQL Server alerts and email operator notifications for more information. In Azure Data Factory (ADF), you can build sophisticated data pipelines for managing your data integration needs in the cloud. But there's no built-in activity for sending an e-mail. In this tip, we'll see how you can implement a workaround using the Web Activity and an Azure Logic App.

Sending an Email with Logic Apps

Logic Apps allow you to easily create a workflow in the cloud without having to write much code. Since ADF has no built-in mechanism to send e-mails, we are going to send them through Logic Apps.

In the top left corner of the Azure Portal, choose to create a new resource.

create new resource

Choose the Logic App resource from the list and click on Create in the next blade.

add logic app resource

You will be asked to specify some details for the new Logic App:

specify name and other details

Click on Create again to finalize the creation of your new Logic App. After the app is deployed, you can find it in the resource menu. Click on the app to go to the app itself. There you can go to the editor by clicking on Edit or on the Logic app designer link.

newly created logic app

When the designer is opened for the first time, you can either choose to start with a new canvas using a common trigger (the event that will start the workflow) or by choosing one of the many available templates.

choose trigger or template

In this tip, we need the HTTP request ("When a HTTP request is received") as the trigger, since we're going to use the Web Activity in ADF to start the Logic App. From ADF, we're going to pass along some parameters in the HTTP request, which we'll use in the e-mail later on. This can be done by sending JSON along in the body of the request.

specify JSON schema

The following JSON schema is used:

    {
        "properties": {
            "DataFactoryName": {
                "type": "string"
            },
            "EmailTo": {
                "type": "string"
            },
            "ErrorMessage": {
                "type": "string"
            },
            "PipelineName": {
                "type": "string"
            },
            "Subject": {
                "type": "string"
            }
        },
        "type": "object"
    }

We're sending the following information:

  • The name of the data factory. Suppose we have a large environment with multiple instances of ADF. We would like to know which ADF instance has a pipeline with an error.
  • The e-mail address of the receiver.
  • An error message.
  • The name of the pipeline where there was an issue.
  • The subject of the e-mail.
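For reference, a request body matching this schema could look like the following (all values are made-up examples; the actual values will be passed in from ADF later on):

```json
{
    "DataFactoryName": "myadf-dev",
    "EmailTo": "support@example.com",
    "ErrorMessage": "Copy activity failed: source file not found.",
    "PipelineName": "PL_CopySalesData",
    "Subject": "ADF pipeline failure in myadf-dev"
}
```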

In the editor, click on New step to add a new action to the Logic App:

add new step

This new step will send the e-mail. When you search for "mail", you'll see there are many different actions:

available mail actions

There is support for Office 365, but also for Outlook.com (the former Hotmail) and for Gmail. Even plain SMTP is available. In this tip, I'll use the Office 365 Outlook action.

First, you'll need to sign in:

sign into service

Make sure pop-ups are allowed (the Edge browser gave me lots of trouble with this; even turning off the pop-up blocker didn't seem to work. Chrome worked without issues):

pop-up blocked

Log in and give consent so that Logic Apps can access your mailbox. In a production environment, you'll probably want to use a mail-enabled service account, where the mailbox (and calendar, contacts ...) are empty. If you have two-factor authentication set up, you might also get a prompt on your mobile device.

give consent

Once you're logged in, you can configure the action. We're going to use dynamic content to populate some of the fields.

configure send mail action

The result looks like this:

configured action

For Office 365 Outlook, there's also a V2 of the send mail action. The difference is that you can now send rich content. An exaggerated example:

send mail v2 with rich content

You can format text, but also include hyperlinks (maybe a link to a log file?). This action is still in preview at the time of writing and as such might be buggy from time to time. For example, using bullet points put the entire body in a bullet point, instead of just a portion of the text. For the remainder of this tip, we'll use V1 of the send mail action.

At this point the Logic App is finished. If you try to test it using the run button, the Logic App will fail, as no body was passed along in the HTTP request.

run logic app in designer

The following error is returned:

logic app fail

So how can we test our app? We can use an online API tester. With such a website, you can construct a POST HTTP request where you pass along a body with content of your choosing.

test logic app

Don't forget to pass along a request header where the content type is set to application/json, or the Logic App will fail again. When you hit test, the POST request will be sent to the Logic App endpoint. The website will let you know if the HTTP request was successful, and it will also show the response headers:
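If you prefer to script the test instead of using a website, a short Python sketch like the one below can build and send the same POST request. The endpoint URL and payload values are made-up placeholders; the real URL is generated by the HTTP trigger of your own Logic App.

```python
import json
import urllib.request

def build_notification_request(url: str, payload: dict) -> urllib.request.Request:
    """Build a POST request for the Logic App's HTTP trigger.

    The Content-Type header is required; without it, the Logic App
    rejects the request."""
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Made-up example values; replace the URL with your trigger's URL.
payload = {
    "DataFactoryName": "myadf-dev",
    "EmailTo": "support@example.com",
    "ErrorMessage": "Copy activity failed: source file not found.",
    "PipelineName": "PL_CopySalesData",
    "Subject": "ADF pipeline failure",
}
req = build_notification_request(
    "https://prod-00.westeurope.logic.azure.com/workflows/...", payload
)
# To actually send the request: urllib.request.urlopen(req)
```

On success, the Logic App endpoint answers with HTTP 202 (Accepted), just like the API tester website shows.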

response headers

You can check the run history of the Logic App in the Azure Portal:

run history

And of course, you can also verify if the e-mail has arrived in your inbox:

email sample

In the next part of the tip, we'll explain how you can integrate the Logic App into the ADF pipeline.


About the author
Koen Verbeeck is a seasoned business intelligence consultant at AE. He has over a decade of experience with the Microsoft Data Platform in numerous industries. He holds several certifications and is a prolific writer contributing content about SSIS, ADF, SSAS, SSRS, MDS, Power BI, Snowflake and Azure services. He has spoken at PASS, SQLBits and dataMinds Connect, and delivers webinars. Koen has been awarded the Microsoft MVP Data Platform award for many years.

This author pledges the content of this article is based on professional experience and not AI generated.


Comments For This Article

Tuesday, July 6, 2021 - 9:03:36 AM - Koen Verbeeck Back To Top (88948)
Hi Alpa,

This should be possible in the logic app. You first retrieve the blob from storage and then you specify the contents in the attachment field. You can find an example here:


Monday, July 5, 2021 - 5:31:12 PM - alpa buddhabhatti Back To Top (88946)
Thank you Koen for sharing this. It's really useful. I have a quick question on this: can we attach a file from blob storage or an SFTP location when sending email from ADF?

Thursday, November 19, 2020 - 8:26:54 AM - Vijeth Back To Top (87810)
Thanks for the wonderful article. It really helped me understand and use Data Factory more efficiently.

Thursday, October 3, 2019 - 5:47:00 PM - Brad G Back To Top (82662)

Hi Koen,

Wanted to say thanks for publishing this 2-parter on email notifications for Azure Data Factory. Very helpful and nicely documented. Much appreciated. It worked exactly as you wrote it up. I have not gotten to the parameterized pipeline method yet but will soon. I've just gotten into ADF and have been missing the email notification option from SSIS.

Friday, August 30, 2019 - 10:06:16 AM - Koen Verbeeck Back To Top (82196)

Hi Siddhesh,

maybe you can use an SMTP server where you can log in using a simple username and password?

Thursday, August 29, 2019 - 4:57:57 AM - Siddhesh Back To Top (82178)

When we configure the connector in Logic Apps, it asks for a login, and my organization has set up two-factor authentication.

My organization's identity team is not allowing an exception for two-factor authentication.

Any workaround in this case?

Monday, July 1, 2019 - 2:00:18 AM - Koen Verbeeck Back To Top (81647)

Hi Phil,

actually, part 2 has been published, but apparently the title was changed a bit. You can find it here:


Friday, June 28, 2019 - 4:35:48 PM - Phil Parkin Back To Top (81629)

Even though part 2 has not yet been published, I was able to call the Logic app from a Data Factory pipeline. I was also able to fully parameterise all of the arguments. This article was very helpful because it set me on the right path, thank you.

Tuesday, April 9, 2019 - 8:20:04 AM - Koen Verbeeck Back To Top (79503)

Hi Austine,

regarding your first question: maybe the Logic App skips files that it has already processed?

Regarding your second question: if the file is stored in blob storage (or data lake), you should be able to retrieve the file contents within the logic app. For example, for blob storage you have the get blob contents action. In the send an email action, you can use dynamic content to put the file content into the body of the email.


Friday, April 5, 2019 - 12:33:18 AM - Austine Back To Top (79475)

Thanks so much for your reply, it helps me a lot. However, when I configured it, it works just fine, but when I add the same files to the folder, the Logic App keeps skipping them without doing anything. I have tried to search for a solution online but found nothing.

My second question:

I have a validation that is done on Databricks and linked to my pipeline. This validation checks file contents and, when it fails, writes into a log folder in Data Lake (just a single line of message). My task is to copy that error message and send it via email.

I have email notification configured as you did in your part 2, but I am struggling with how to get the latest error message and pass it to the email.

I am thinking of using a lookup, but then I don't know how to go about it. Thanks for your help. Much appreciated.


Wednesday, April 3, 2019 - 2:01:14 PM - Koen Verbeeck Back To Top (79458)

Hi Austine,

sure, there's an action "create a new pipeline run" available in Azure Logic Apps. When creating a new action, just search for data factory and it will be there in the actions.


Wednesday, April 3, 2019 - 10:17:05 AM - Austine Back To Top (79453)


Nice blog you got.

I am wondering if it is possible to call a pipeline in Data Factory from a Logic App.

I have a Logic App that checks if a file exists in an FTP folder. I want to be able to call my pipeline once the Logic App triggers.

Is there a way to do this?

Many thanks
