Basic Database Continuous Integration and Delivery (CI/CD) using Visual Studio Team Services (VSTS)


By: Haroon Ashraf   |   Updated: 2018-02-22   |   Comments (9)   |   Related: More > DevOps

Problem

You work as a lead SQL Server database developer for a company that provides digital services to its clients through their database system. You have been asked to implement basic database continuous integration and delivery (CI/CD) to ease the pressure on your development team and improve performance by automating build and deployment processes in compliance with development best practices. In this tip we will look at how this can be done using Visual Studio Team Services.

Solution

The solution is to put your database code in an in-house or cloud-based source control system such as VSTS (Visual Studio Team Services) and configure database continuous integration and delivery pipelines for smooth database development and deployment cycles.

This tip focuses on a cloud-based source control system (Visual Studio Team Services) for database continuous integration and delivery.

Basic Concepts of Database CI/CD

Let's first go through some of the basic concepts of database continuous integration and delivery.

Database Continuous Integration (CI)

Database Continuous Integration (CI) is the process of automating builds, which means a build is triggered every time you or your development team checks database code into source control.

Since the source code is your single source of truth, database continuous integration encourages your development team to put their work under source control frequently, which means no long-lived shelvesets are left in isolation and all of the developers' work can be joined together using database continuous integration.

Database Continuous Delivery (CD)

Database Continuous Delivery means automating database deployments and replacing manual interventions (as much as possible) when the database is prepared to be delivered to any target environment (Dev, Test, QA, etc.) but ultimately to Production.

Smooth Database Continuous Integration (CI) further leads to a smoother Database Continuous Delivery (CD).

Local Build(s) vs. Debugging

Run a local build in Visual Studio by pressing Ctrl+Shift+B, which compiles the database project and fails in case of errors.

On a successful build, a SQL Server Database Project deploys the database code to the local debug database; this is done by debugging the project (pressing F5).

Therefore, debugging in the context of a SQL Database Project does the following things:

  1. Compiles (builds) the database objects (code) defined in the database project
  2. Deploys the database project to the debug database
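Outside of Visual Studio, the two steps F5 performs correspond roughly to a build followed by a DACPAC publish. The sketch below prints the equivalent commands as a dry run; the solution path, DACPAC path, debug instance, and database name are assumptions, not values taken from this tip:

```shell
# Dry-run sketch: the two steps F5 performs, as command-line equivalents.
# All paths and names below are placeholders -- adjust for your own project.
SOLUTION="DigitalServicesCICDSolution.sln"
DACPAC="DigitalServices/bin/Debug/DigitalServices.dacpac"
DEBUG_INSTANCE="(localdb)\\ProjectsV13"   # a typical SSDT debug instance

# 1. Compile (build) the database project, producing a DACPAC
echo "msbuild $SOLUTION /t:build"

# 2. Deploy the DACPAC to the debug database
echo "SqlPackage /Action:Publish /SourceFile:$DACPAC /TargetServerName:$DEBUG_INSTANCE /TargetDatabaseName:DigitalServices"
```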

Artifacts

Artifacts are the deployment-ready files or packages that can be picked up by a Release Manager to deploy to target environments.

DACPAC

A DACPAC is a single file (package) which contains the definitions of all database objects, ready to be transformed into a database.

In this tip we are going to publish a DACPAC as an Artifact.
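As a side note, a DACPAC is simply a ZIP package; if you rename one to .zip and open it, you should find the model inside. The sketch below (with an assumed file name) lists the package, falling back to printing the entries a typical DACPAC contains:

```shell
# A .dacpac is a ZIP archive; model.xml inside it holds the object definitions.
DACPAC="DigitalServices.dacpac"   # assumed artifact name
if [ -f "$DACPAC" ]; then
  unzip -l "$DACPAC"              # typically lists model.xml, DacMetadata.xml, Origin.xml
else
  # Typical entries found inside a DACPAC package:
  printf '%s\n' model.xml DacMetadata.xml Origin.xml '[Content_Types].xml'
fi
```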

Environments

Multiple environments can be set up when automating or semi-automating deployments. The environment can be Test, Pre-Production, Production, QA or even Shared Dev.

Database CI/CD Dependency on Source Control

Please remember that database continuous integration and delivery cannot take place unless you check your database code into source control.

Database CI/CD Setup

Prerequisites

This tip assumes that you already have a Visual Studio Team Services (VSTS) account.  If you do not have one, it is very easy to create a free VSTS account using your Microsoft email address.

Please note that you have to comply with the license terms when using a Visual Studio Team Services account and similar services from Microsoft or any other vendor.

VSTS Project Creation

Let's begin by logging into the Visual Studio Team Services (VSTS) account and creating a new project “Digital Services Database CI and CD”, choosing Git version control and the Scrum work item process (you can choose any other work item process as well) as shown below:

create new project

Open Visual Studio Project from VSTS

Next click on the “Code” tab in Visual Studio Team Services and then click on the “Clone in Visual Studio” option to open the project in Visual Studio (you must have Visual Studio 2013+):

digital service database

Creating Local Repo (Git repository)

Pressing the “Clone in Visual Studio” button is going to open Visual Studio (provided you have Visual Studio 2013+) and you are required to create a local repository by clicking on “Clone” as shown below:

team explorer

Creating a Database Project

Next you will be asked to create a new project or solution.

digital services

Create a new project “DigitalServices” under “DigitalServicesCICDSolution” (solution) as follows:

sql server database project

Creating Database Objects from Script

Let's create database objects to replicate the scenario where you have built these objects in one go. The reason I have mentioned one go is that we have not yet put the database objects under source control. The starting point in database CI/CD is to build and check your code into source control (TFS/Git).

Please use the digital services (sample) database script as shown below:

-- DigitalServices Database Setup Script 
-- Haroon Ashraf (09-Feb-2018) 
-- This script can be imported into a SQL Database Project to create the database objects for DigitalServices sample database 
-- Please Note: The script is provided "AS-IS" and should be used at your own discretion 
  
-- USE DigitalServices 
  
  
-- Creating Client Table 
CREATE TABLE [dbo].[Client] 
( 
   [ClientId] INT NOT NULL IDENTITY(1,1) , 
   [Company] VARCHAR(40) NOT NULL, 
   [Email] VARCHAR(320) NOT NULL, 
   [Phone] VARCHAR(50) NULL, 
   [RegistrationDate] DATETIME2, 
   [Status] bit, 
   CONSTRAINT [PK_Client] PRIMARY KEY ([ClientId]) 
); 
GO 
  
CREATE PROCEDURE [dbo].[AddClient] 
   @Company VARCHAR(40) , 
   @Email VARCHAR(320) , 
   @Phone VARCHAR(50) , 
   @RegistrationDate DATETIME2, 
   @Status bit -- 1 active, 0 closed 
AS 
SET NOCOUNT ON 
INSERT INTO dbo.Client 
(Company,Email,Phone,RegistrationDate,Status) 
VALUES 
(@Company,@Email,@Phone,@RegistrationDate,@Status) 
RETURN 0; 
GO 
    
CREATE PROCEDURE [dbo].[RemoveClient] 
    @ClientId INT 
AS 
DELETE FROM dbo.Client WHERE ClientId = @ClientId 
RETURN 0; 
GO 
  
CREATE TABLE [dbo].[Service] 
( 
   [ServiceId] INT NOT NULL IDENTITY(1,1), 
   [Name] VARCHAR(40), 
   [Detail] VARCHAR(1000), 
   [Pricing] DECIMAL(10,2), 
   CONSTRAINT [PK_Service] PRIMARY KEY ([ServiceId]) 
); 
GO 
  
CREATE PROCEDURE [dbo].[AddService] 
   @Name VARCHAR(40), 
   @Detail VARCHAR(1000), 
   @Pricing DECIMAL(10,2) 
AS 
INSERT INTO Service 
(Name,Detail,Pricing) 
VALUES 
(@Name,@Detail,@Pricing) 
RETURN 0; 
GO 
  
CREATE PROCEDURE [dbo].[RemoveService] 
@ServiceId INT 
AS 
DELETE FROM Service WHERE ServiceId = @ServiceId 
RETURN 0; 
GO 
  
CREATE TABLE [dbo].[ServiceOrder] 
( 
   [ServiceOrderId] INT NOT NULL IDENTITY(1,1), 
   [ClientId] INT, 
   [ServiceId] INT, 
   [PurchaseDate] DATETIME2, 
   [AmountPaid] DECIMAL (10,2), 
   [Validity] DATETIME2, 
   CONSTRAINT [PK_ServiceOrder] PRIMARY KEY ([ServiceOrderId]), 
   CONSTRAINT [FK_ServiceOrder_Client] FOREIGN KEY ([ClientId]) REFERENCES [Client]([ClientId]), 
   CONSTRAINT [FK_ServiceOrder_ToTable] FOREIGN KEY ([ServiceId]) REFERENCES [Service]([ServiceId]) 
); 
GO 
  
CREATE PROCEDURE [dbo].[PurchaseService] 
   @ClientId INT, 
   @ServiceId INT, 
   @PurchaseDate DATETIME2, 
   @AmountPaid DECIMAL (10,2), 
   @Validity DATETIME2 
AS 
INSERT INTO ServiceOrder 
(ClientId,ServiceId,PurchaseDate,AmountPaid,Validity) 
VALUES 
(@ClientId,@ServiceId,@PurchaseDate,@AmountPaid,@Validity) 
RETURN 0; 
GO 

Next import the script into the Project after downloading it to a folder:

solution explorer

After a few more steps click the “Finish” button and see your project getting populated with database objects from the script:

database objects

Please configure the database project using the settings mentioned below:

Target Platform (under Project Settings):

Set the Target Platform as desired. In our case we have set it to SQL Server 2012, but it can be any later version as well, such as SQL Server 2014 or SQL Server 2016.

Debug Database:

Also, set the debug database to point to your local SQL instance on a dev machine (if you prefer to do so).

digital services

Always Recreate Database Deployment Option:

Please set the “Always recreate database” deployment option to true under Project Settings > Debug.

Debug Database Project to Kick-Off Local Build

Press F5 to debug the project which will kick off the BUILD and then deploy the code to the debug database using local dev settings:

debug database

Putting Project under Git Source Control

The next step is to check your code into source control by pressing Ctrl+Alt+F7 anywhere in the Solution Explorer, adding a comment “Basic database objects ready”, and then clicking on the “Commit All and Push” button as shown below:

team explorer

Now padlock icons appear next to the project items (stored procedures, tables, etc.) showing they have been checked into source control:

digital services

Database CI/CD Implementation

Now that the project is under source control it meets the minimum requirements to implement database continuous integration and delivery (CI/CD).

Creating New Build Definition in VSTS

The first step is to define a new build definition in Visual Studio Team Services to automate the Build Process.

Navigate to the Builds Tab in VSTS and click on “New Definition” as follows:

build definitions

Next, keep the default settings as they are and click on “Continue” as shown in the figure below:

this account

Then in the next step start with an empty template:

apply

Select the “Hosted” agent queue from the drop-down box as shown below:

digital services

Then add a new task “MSBuild” by clicking the “+” Symbol next to Phase 1 as follows:

msbuild

Next, locate the Solution file by clicking on the ellipses next to the Project text box as follows:

build and release

Then point it to the solution file as follows:

select path

Add the MSBuild arguments “/t:build /p:CmdLineInMemoryStorage=True” as shown below:

msbuild arguments
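For reference, the same compilation can be reproduced on a developer machine; the command below mirrors the MSBuild task configured above, printed as a dry run (the solution name is an assumption):

```shell
# Local equivalent of the VSTS MSBuild task, printed rather than executed.
SOLUTION="DigitalServicesCICDSolution.sln"   # assumed solution name
MSBUILD_ARGS="/t:build /p:CmdLineInMemoryStorage=True"
echo "msbuild $SOLUTION $MSBUILD_ARGS"
```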

Next, add another build task “Copy Files” as shown in the figure below:

build release

Let us now point to the DACPAC file, which a successful build produces by default in the bin/Debug folder under the project folder:

process

Then add another Task “Publish Build Artifacts” as follows:

add tasks

Please add the following settings in this task:

publish build artifacts

Enabling Database Continuous Integration

Next navigate to the “Triggers” tab under VSTS Build and Release and check “Enable continuous integration” and then click on “Save and queue” option as follows:

enable continuous

Next add the following and then click on “Save & queue”:

save build definition and queue

Queuing and Running Build

Next click on the build number and you will see the build getting queued and then up and running in a few seconds:

build succeeded

Test Running Database CI/CD

Now create a view “Clients” in the database project to see all the clients using the following code:

CREATE VIEW [dbo].[Clients]
AS
SELECT [ClientId], [Company], [Email], [Phone], [RegistrationDate], [Status]
FROM [dbo].[Client];
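After deploying the change to the debug database (F5), the view can be spot-checked from a command line with sqlcmd. The sketch below prints the command as a dry run; the instance and database names are placeholders (assumptions):

```shell
# Dry-run sketch: query the new view with sqlcmd (names are placeholders).
SERVER="(localdb)\\ProjectsV13"
DB="DigitalServices"
QUERY="SELECT TOP (5) ClientId, Company, Email FROM dbo.Clients;"
echo "sqlcmd -S $SERVER -d $DB -Q \"$QUERY\""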

Put the database code changes under source control as shown below:

team explorer

Immediately go to the Builds tab in VSTS as follows:

build definitions

We can see the build has been triggered automatically as soon as the code was put under source control.

Now after a few seconds, check the status of the build:

build succeeded

Artifacts Check

Finally check the Artifacts to see the DACPAC:

dacpac

So your Database Continuous Integration (CI) is working and your DACPAC is ready for further Continuous Deployment or Delivery.

A simple scenario for Continuous Delivery (CD) can be automating the task of pushing the Artifact (DACPAC) to the target environment (Test, QA, Pre-Production or Production) to be deployed; however, the detailed steps are beyond the scope of this tip.
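A minimal sketch of that delivery step is shown below as a dry run: SqlPackage publishes the artifact to a target server. The artifact path, server, and database names are placeholders; a real release definition would execute the command from an agent with credentials configured.

```shell
# Hypothetical CD step: push the published DACPAC to a target environment.
# Printed as a dry run; all names below are placeholders.
DACPAC="drop/DigitalServices.dacpac"     # artifact path from the build (assumed)
TARGET_SERVER="test-sql01.example.com"   # placeholder target server
TARGET_DB="DigitalServices"

CMD="SqlPackage /Action:Publish /SourceFile:$DACPAC /TargetServerName:$TARGET_SERVER /TargetDatabaseName:$TARGET_DB"
echo "$CMD"
```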

Next Steps
  • After going through the process, download the published Artifact (DACPAC) and deploy it locally to see if you can create a database from it.
  • Please go through my previous Tip and see if you can replicate developing a multi-customer database using database continuous integration and delivery.
  • Please have a look at my previous Tip to add tSQLt unit tests to your project and see if you can add those tests to the VSTS build.
  • Try creating the whole project mentioned in my previous Tip using test-driven development (TDD).
  • Learn more about SQL Server Data Tools from the MSDN Library.
  • Please explore further Database CI/CD by visiting the VSTS home page.





About the author
MSSQLTips author Haroon Ashraf

Haroon Ashraf's interests are Database-Centric Architectures and his expertise includes development, testing, implementation and migration along with Database Life Cycle Management (DLM).


Thursday, April 05, 2018 - 3:16:32 AM - Haroon Ashraf

I would like to share that the article about adding reference data in the database continuous integration pipeline is now available.

Please click the link below to see the article:

https://www.mssqltips.com/sqlservertip/5421/adding-reference-data-in-database-continuous-integration-ci-pipeline/


Friday, March 02, 2018 - 5:43:04 PM - Haroon Ashraf

Adding reference data in database continuous integration and delivery workflow is a very good point, which requires a more solid approach to get reference data in sync with all the environments (sandboxed, dev, test and production) first.

Please stay in touch with MSSQLTips as more tips of similar type are on their way.

Friday, March 02, 2018 - 3:03:55 PM - Haroon Ashraf

Thank you, John, for your kind remarks.

Yes, it would be really nice to get familiar with your database CI/CD workflows which look quite promising as I said earlier.

Meanwhile, please stay in touch with MSSQLTips; if I write more about database continuous integration and delivery, your comments can add more to it.

Thank you once again for your valued comments.

Thursday, March 01, 2018 - 9:55:50 PM - John Shkolnik

A great sentiment with which I agree 100%, Haroon.

I'd be happy to speak with you live about how we're doing CI/CD if you're ever interested.


Thursday, March 01, 2018 - 11:17:18 AM - rr

Can you please add how to include static/reference data with the CI/CD mentioned in this article?


Wednesday, February 28, 2018 - 6:52:11 PM - Haroon Ashraf

Thank you very much for your detailed comments.

Rather than going any further to prove which of the two unit-testing frameworks is best, why not discuss some common points which others may find helpful?

Let us refer to the same video by Eric about database unit-testing with Slacker, in which he gives a good set of guidelines for a unit-testing framework, as follows:

  1. Easy to setup and run
  2. Flexible to mock or generate test data
  3. Easy to develop test logic
  4. Easy to debug test code or code issues
  5. Seamless integration with the CI/CD pipeline

Slacker already complies with all of the above, but the same is true for tSQLt unit tests.

tSQLt does not require the actual database to be changed to support the tests if we are working with SSDT (SQL Server Data Tools), and I assume the same is true for Slacker.

A database project in SSDT serves as the source of truth, apart from the other benefits it offers, when there are multiple environments and multiple versions of the same database floating around, but it is in no way a limitation.

Connected development mode in SSDT does not require any model at all; rather, we work directly on the database objects, depending on the requirements.

Thank you once again for sharing your CI/CD workflows which look quite promising. :)

Tuesday, February 27, 2018 - 9:59:14 PM - John Shkolnik
  1. This is true. Then again, it's quite easy to pick up Ruby syntax.
  2. SQL isn't really conducive to writing this sort of logic. Does tSQLt have an equivalent to the reusable erb templates so the SQL called by unit tests can be assembled dynamically? Like this: https://github.com/vassilvk/slacker/blob/master/lib/slacker_new/project/sql/common/sproc.sql.erb
  3. The csv's are an option for readability and reusability; you can do it all inline if you want. Same is true for the functions. That's like saying SQL scalar functions are a drawback.
  4. I don't know what you mean by this because, unless I misunderstood, Slacker does that as well.
  5. This is true but it doesn't seem like it'd take that much to add. A better question is how the concept of a unit test can be applied to something as large as a cross-database scenario?
  6. tSQLt is more popular, yes, but it's been around longer and something was better than nothing. That doesn't make it better.

Correct me if wrong but doesn't tSQLt require the actual database being tested to be changed in order to support the tests? This seems like a fundamental no-no. For example, setting a database to trustworthy can easily allow something to work which would otherwise fail.

Our database projects are not cumulative from inception. We can recreate our databases as of any point in time as if they were "born" like that (with all system data) rather than having to deploy a model and upgrade it to current. Our CI builds deploy a temporary database instance as of that point in time, run Slacker tests (and publish the results to the test tab via adapter), and then drop the database. For our release pipelines, we also layer in functional tests by seeding in data and seeing how the instances perform both as new deployments as well as upgraded from previous point in time.

Eric Kang posted a video last year demonstrating Slacker: https://channel9.msdn.com/Shows/Visual-Studio-Toolbox/SQL-Server-Database-Unit-Testing-in-your-DevOps-pipeline


Sunday, February 25, 2018 - 2:13:28 PM - Haroon Ashraf

Thank you for sharing your valued comments.

Yes, Slacker is another competitive database unit-testing framework as you mentioned.

We have the following benefits when tSQLt is used as the unit-testing framework:

(1) With tSQLt no context switching is required (your developers don’t need to switch to any other language from SQL), as is the case with Slacker, which is an RSpec-based testing tool written in Ruby

(2) tSQLt unit tests inherently support test-driven development, in that your development team writes SQL object code as tests in the same way they would write the object code in the first place in the absence of test-driven development

(3) Slacker tests depend on function calls to mock data, which is stored in separate csv files, while tSQLt unit tests are self-contained and do not need to refer to any external reference data; rather, the reference data is stored in the same test

(4) Isolating dependencies for an object under test in tSQLt has an edge over other database unit-testing frameworks

(5) tSQLt supports cross-database unit testing

(6) tSQLt is the underlying unit-testing framework behind many well-known commercial database testing tools.

However, the choice is yours and it depends on your requirements and preferences.

With Regards to deploying and dropping database during CI build, can you please give some more information?

Thank you once again for sharing your comments with us.


Saturday, February 24, 2018 - 9:12:42 PM - John Shkolnik

We deploy and drop our database during the CI build. We also run database unit tests using Slacker, which is superior to tSQLt (in my opinion).


