Avoid External Dependencies in SQL Server Triggers

By: Aaron Bertrand   |   Updated: 2011-01-26   |   Comments (3)   |   Related: Triggers


Problem
I sometimes want to perform auditing or other actions in a trigger based on some criteria. More specifically, there are a few cases that may warrant an e-mail; for example, if a web sale takes place that requires custom or overnight shipping and handling. It is tempting to just add code to the trigger that sends an e-mail when these criteria are met. But this can be problematic for two reasons: (1) your users are waiting for that processing to occur, and (2) if you can't send the e-mail, how do you decide whether or not to roll back the transaction, and how do you bring the problem to the attention of the administrator?
Solution
Since anything you do in a trigger is going to hold up the statement that caused the trigger to fire, you should always go out of your way to make sure your processing is as efficient as possible. This is especially true for processes that require waiting for some external event to occur, or that are prone to errors. In the past, the various approaches we've had for sending e-mail from SQL Server - SQL Mail, the sp_OA procedures using CDO, and even Gert Drapers' xp_smtp_sendmail - have all been problematic in their own way: the local mail profile is corrupt, the server has been migrated to 64-bit hardware, the sp_OA procedures are slow, OLE automation has been disabled, or the SMTP server is down.

These days, most DBAs use Database Mail which, because it uses Service Broker, does not exhibit any of these problems - it places your message on an asynchronous queue, and returns control almost immediately. But it is still questionable whether these e-mails should be initiated from within the trigger at all - not just because we want to squeeze every ounce of performance out of our user-visible operations, but also because there is no trivial way to deal with send failures unless we monitor the send queue from within the trigger.
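As a side note, even with Database Mail you may still want visibility into send failures, and you can get that outside the trigger by polling the msdb status views. Here is a minimal sketch (the 24-hour window is just an arbitrary example):

-- Check recent Database Mail failures without involving the trigger.
SELECT f.mailitem_id,
       f.recipients,
       f.subject,
       f.send_request_date,
       l.description
  FROM msdb.dbo.sysmail_faileditems AS f
  LEFT OUTER JOIN msdb.dbo.sysmail_event_log AS l
    ON l.mailitem_id = f.mailitem_id
 WHERE f.send_request_date >= DATEADD(HOUR, -24, CURRENT_TIMESTAMP);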

So a typical starting point would be a table and trigger that looks like this:

CREATE TABLE dbo.Orders
(
  OrderID        INT PRIMARY KEY,
  /* other columns */
  ShippingMethod VARCHAR(10) NOT NULL
    CHECK (ShippingMethod IN ('Express', 'Normal', 'Slow'))
);
GO

CREATE TRIGGER dbo.Order_ShippingNotify
ON dbo.Orders
FOR UPDATE, INSERT
AS
BEGIN
  SET NOCOUNT ON;

  DECLARE
    @OrderID INT,
    @body    NVARCHAR(MAX);

  DECLARE c CURSOR
    LOCAL FORWARD_ONLY STATIC READ_ONLY
    FOR
      SELECT i.OrderID
        FROM inserted AS i
        LEFT OUTER JOIN deleted AS d
          ON i.OrderID = d.OrderID
        WHERE i.ShippingMethod = 'Express'
          AND (d.OrderID IS NULL OR d.ShippingMethod <> 'Express');

  OPEN c;

  FETCH NEXT FROM c INTO @OrderID;

  WHILE @@FETCH_STATUS = 0
  BEGIN
    SET @body = N'OrderID: ' + CONVERT(VARCHAR(12), @OrderID);

    EXEC msdb.dbo.sp_send_dbmail
      @recipients = '[email protected]',
      @subject    = N'Order has expedited shipping.',
      @body       = @body;

    FETCH NEXT FROM c INTO @OrderID;
  END

  CLOSE c;
  DEALLOCATE c;
END
GO

Note that, because sp_send_dbmail has to be executed once per row (and the trigger can fire for an insert or update that affects more than one row), we need a cursor to loop through and handle the rows one at a time. So, in addition to sending e-mails in the first place, we also add the overhead of a cursor and a loop while the end user waits for the transaction to complete.

In my environments, I have opted to use a queue table and a background job to handle almost any post-DML operation that does not involve a decision to roll back the original statement from within the trigger. Some events, such as updating a last modified column, are usually simple enough to just keep in the trigger; and on their own, do not justify any additional effort. But I would suggest that anything beyond the source table that is not a critical part of the initial transaction should be deferred to background processing when practical. I'll stick to the e-mail example for now, but other candidates include fetching Active Directory properties for the user who initiated the operation, writing output to a file, or interacting with an external program.
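For contrast, here is the kind of lightweight, set-based work I would leave in the trigger itself. This is just a sketch; it assumes a hypothetical LastModified column, which the dbo.Orders table above does not actually define:

-- Hypothetical: add a LastModified column and keep it current from the trigger.
ALTER TABLE dbo.Orders ADD LastModified DATETIME NULL;
GO

CREATE TRIGGER dbo.Order_SetLastModified
ON dbo.Orders
FOR UPDATE
AS
BEGIN
  SET NOCOUNT ON;

  UPDATE o
    SET LastModified = CURRENT_TIMESTAMP
    FROM dbo.Orders AS o
    INNER JOIN inserted AS i
      ON o.OrderID = i.OrderID;
END
GO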

The queue table itself can be quite simple - especially if it only needs to deal with a single type of post-trigger operation for one table (you can of course use a single queue to process as many different types of results as you want). In my case we wanted to keep track of the key (OrderID), the time the queue row was created, and the time the e-mail went out. So, for this example, it would look like this:

CREATE TABLE dbo.OrderShippingQueue
(
  QueueID         INT IDENTITY(1,1) PRIMARY KEY,
  OrderID         INT NOT NULL
                    FOREIGN KEY REFERENCES dbo.Orders(OrderID),
  ObservationTime SMALLDATETIME NOT NULL
                    DEFAULT CURRENT_TIMESTAMP,
  ProcessTime     SMALLDATETIME NULL
);
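Since the background job will repeatedly look for unprocessed rows, an index can help it find them quickly. This is optional; the filtered form below assumes SQL Server 2008 or later (on 2005 you could index ProcessTime without the WHERE clause):

-- Optional: help the background job locate unprocessed rows efficiently.
CREATE INDEX IX_OrderShippingQueue_Unprocessed
  ON dbo.OrderShippingQueue(OrderID)
  WHERE ProcessTime IS NULL;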

And the stored procedure to process the queue, borrowing from the trigger above, simply looks for rows in the queue table with a NULL ProcessTime. This procedure can easily be scheduled using SQL Server Agent (a sample job definition follows the procedure):

CREATE PROCEDURE dbo.Order_ProcessShippingQueue
AS
BEGIN
  SET NOCOUNT ON;

  DECLARE
    @OrderID INT,
    @body    NVARCHAR(MAX);

  DECLARE c CURSOR
    LOCAL FORWARD_ONLY STATIC READ_ONLY
    FOR
      SELECT OrderID
        FROM dbo.OrderShippingQueue
        WHERE ProcessTime IS NULL;

  OPEN c;

  FETCH NEXT FROM c INTO @OrderID;

  WHILE @@FETCH_STATUS = 0
  BEGIN
    SET @body = N'OrderID: ' + CONVERT(VARCHAR(12), @OrderID);

    EXEC msdb.dbo.sp_send_dbmail
      @recipients = '[email protected]',
      @subject    = N'Order has expedited shipping.',
      @body       = @body;

    UPDATE dbo.OrderShippingQueue
      SET ProcessTime = CURRENT_TIMESTAMP
      WHERE OrderID = @OrderID;

    FETCH NEXT FROM c INTO @OrderID;
  END

  CLOSE c;
  DEALLOCATE c;
END
GO
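As mentioned above, scheduling this is straightforward with SQL Server Agent. Here is a minimal sketch using the msdb job procedures; the job name, the five-minute frequency, and the database name ('Sales') are placeholders you would adapt:

USE msdb;
GO

EXEC dbo.sp_add_job
  @job_name = N'Process Order Shipping Queue';

EXEC dbo.sp_add_jobstep
  @job_name      = N'Process Order Shipping Queue',
  @step_name     = N'Run dbo.Order_ProcessShippingQueue',
  @subsystem     = N'TSQL',
  @database_name = N'Sales', -- placeholder database name
  @command       = N'EXEC dbo.Order_ProcessShippingQueue;';

EXEC dbo.sp_add_schedule
  @schedule_name        = N'Every 5 minutes',
  @freq_type            = 4, -- daily
  @freq_interval        = 1,
  @freq_subday_type     = 4, -- units of minutes
  @freq_subday_interval = 5;

EXEC dbo.sp_attach_schedule
  @job_name      = N'Process Order Shipping Queue',
  @schedule_name = N'Every 5 minutes';

EXEC dbo.sp_add_jobserver
  @job_name = N'Process Order Shipping Queue';
GO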

And finally, the trigger can now be changed to the following, eliminating both the e-mail activity and the cursor for a much simpler, set-based insert into the queue table:

ALTER TRIGGER dbo.Order_ShippingNotify
ON dbo.Orders
FOR UPDATE, INSERT
AS
BEGIN
  SET NOCOUNT ON;

  INSERT dbo.OrderShippingQueue(OrderID)
  SELECT i.OrderID
    FROM inserted AS i
    LEFT OUTER JOIN deleted AS d
      ON i.OrderID = d.OrderID
    WHERE i.ShippingMethod = 'Express'
      AND (d.OrderID IS NULL OR d.ShippingMethod <> 'Express');
END
GO
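To verify the new behavior, you can insert a couple of test rows and check the queue - only the express order should be queued (the sample values are made up):

-- Quick test: only the 'Express' order should land in the queue.
INSERT dbo.Orders (OrderID, ShippingMethod) VALUES (1, 'Express');
INSERT dbo.Orders (OrderID, ShippingMethod) VALUES (2, 'Normal');

SELECT QueueID, OrderID, ObservationTime, ProcessTime
  FROM dbo.OrderShippingQueue;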

If you need to send out different e-mails for different states, you can add columns to the queue table that make those choices obvious to the consuming stored procedure. For example, you may want the trigger to queue another e-mail if a customer downgrades their shipping choice in the meantime, so that you don't continue expediting the order. You may also want to add logic that prevents an e-mail in the first place if someone upgrades to express shipping and then downgrades back to normal or slow.
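A minimal way to capture that state is to record the shipping method that caused the row to be queued; this is just a sketch, and assumes you also change the trigger's INSERT to populate the new column:

-- Record which shipping method the row was queued for, so the
-- processing procedure can pick the appropriate e-mail.
ALTER TABLE dbo.OrderShippingQueue
  ADD ShippingMethod VARCHAR(10) NULL;

-- ...and in the trigger:
-- INSERT dbo.OrderShippingQueue(OrderID, ShippingMethod)
-- SELECT i.OrderID, i.ShippingMethod FROM inserted AS i ...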

Also, you could choose to add an additional table to store the metadata about the e-mail content depending on the state, rather than bake subjects and bodies into the stored procedure code - this can also include whether a different recipient should receive e-mails that reflect these different states.
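One possible shape for such a metadata table (the table and column names here are purely illustrative):

-- Illustrative: one row per shipping state, carrying the recipient,
-- subject and body template for that state's e-mail.
CREATE TABLE dbo.ShippingNotificationConfig
(
  ShippingMethod VARCHAR(10)   NOT NULL PRIMARY KEY,
  Recipients     VARCHAR(255)  NOT NULL,
  Subject        NVARCHAR(255) NOT NULL,
  BodyTemplate   NVARCHAR(MAX) NOT NULL
);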

And you could go a step further than this, and use Service Broker to handle the insertion into the queue table, or even process the sending of the e-mail for you (I'll leave that for a future tip). One advantage of the background job is that you can easily fine-tune how frequently the queue table is polled - whether you want it to run every minute or only once an hour, it is a simple change to the job's schedule. Another advantage is that you get an automatic retry mechanism as well as an audit trail that is as permanent as you want it to be.
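For the retry and audit trail side of things, one approach - a sketch, not something the procedure above already does - is to track attempts and periodically purge old, processed rows:

-- Track how many times the job has tried each notification, so persistent
-- failures can be surfaced instead of being retried forever.
ALTER TABLE dbo.OrderShippingQueue
  ADD Attempts INT NOT NULL DEFAULT 0;

-- In the processing procedure you could then filter with, for example:
--   WHERE ProcessTime IS NULL AND Attempts < 5
-- and increment Attempts on each pass.

-- Keep the audit trail as long as you like; e.g. purge rows processed
-- more than 90 days ago.
DELETE dbo.OrderShippingQueue
 WHERE ProcessTime IS NOT NULL
   AND ProcessTime < DATEADD(DAY, -90, CURRENT_TIMESTAMP);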

About the author
Aaron Bertrand (@AaronBertrand) is a passionate technologist with industry experience dating back to Classic ASP and SQL Server 6.5. He is editor-in-chief of the performance-related blog, SQLPerformance.com, and also blogs at sqlblog.org.





Comments For This Article




Wednesday, February 2, 2011 - 4:47:19 PM - Aaron Bertrand

A loop is a loop - whether you say DECLARE CURSOR or not, it is still row-by-row processing with all the baggage that implies.

Anyway, the tip showed explicitly how to *avoid* using the cursor in the trigger, so I'm not sure where you're going with your comment...


Wednesday, February 2, 2011 - 1:23:30 PM - LB213

A cursor in a trigger is asking for performance issues, best practice is to use a table variable and a loop counter.


Wednesday, January 26, 2011 - 8:53:47 AM - Jack Corbett

Aaron,

This is a great tip.  I often offer similar advice to people on forums and whenever discussing triggers.  I've used the queue table approach in the past with only one difference, and that is that I have not set up the foreign key relationship between the queue table and the source table.  This is because I typically do not use Cascade Deletes and I don't want to stop the deletion of a source row if there is something in the queue table.  Your use has me re-thinking that approach.