SQL Server Data Tools Tutorial – Implementation to Deployment


SSDT tooling is Microsoft's standard for Database Version Management and Release Automation. This article covers the usage of SQL Server Data Tools (SSDT) and their integration into existing DevOps processes. We will start with the basics, explaining the importance and usage of SSDT, followed by a sample implementation covering multiple related Databases. At the end we will cover how to automate the Release and Deployment process using available CICD tools and processes.

Two related Database Projects will be implemented to add a bit of complexity and to discuss specific scenarios that have to be handled with a specific procedure. This sample Solution has been made available in a public GitHub Repo, and details are shared below. Let's get started.

GitHub Repository Details

You can find the solution containing the sample projects using this GitHub Repo link.

What's Covered in this Article

  • SSDT Basics
  • Sample Projects Structure and Problem with Recursive Project References in Visual Studio
  • DACPAC Output Format Explained
  • Usage of Pre and Post-Deployment Files
  • Build Actions on SQL Files and their importance
  • SQLCMD variables in Database Projects
  • Publishing Profile and Storing the Profile
  • Release Management in Database Projects
  • SQLPackage.exe for Command-line and CICD Integration

SSDT Basics

SQL Server Data Tools, or SSDT for short, is part of Microsoft's standard tooling for Database version control and release process automation. In Visual Studio 2019, the tools are available as part of the standard Visual Studio installation, and for older versions Microsoft has provided all the details in MS docs (link shared below). SSDT provides a great alternative for Database Development, Maintenance, Version Management and enabling an Automated Release process, all from within Visual Studio.

It's important to note that the SQL Server Object Explorer that has always been available in Visual Studio products is also part of the SSDT framework. Where SQL Server Object Explorer works in a connected mode with an actual local or remote SQL Server, SSDT also provides the flexibility of offline database development right within Visual Studio, i.e. in a completely disconnected mode. This provides great flexibility: developers can implement and make changes to Database Objects (which are basically .sql script files) within Visual Studio and later commit all these changes as a batch when ready for DB deployment to SQL Server.

When deploying to a target database server, SSDT uses a Schema Comparison process: it compares the DB Objects in the Database Project with the actual DB Objects in the target database server and generates a Change Script for incremental deployment. This change script will only contain the SQL needed to sync the target database with the state of the objects in the Database Project (the source). During deployment, there are options available to capture detailed logs and the actual incremental script being deployed.
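As a hedged illustration (the column and constraint here are hypothetical, not part of the sample projects), an incremental change script generated by the schema compare could look like this:

```sql
-- Hypothetical incremental script produced by the SSDT schema compare:
-- the project defines a column that the target database does not have yet.
ALTER TABLE [dbo].[ProductList]
    ADD [DiscontinuedDate] DATETIME2 NULL;
GO
-- Objects that already match the project definition are left untouched,
-- so the script contains only the delta between source and target.
```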

There is also an option to re-create the entire Database every time. This may be useful while setting up a new development environment or to reset the databases on developer machines if something goes wrong. These will be explained in later sections.

For more information on SSDT visit this link.

Some of the core features provided by SSDT are:

  • Databases can be source controlled just like regular source code using Git/TFS/Azure DevOps and other source control systems. This is great in terms of change tracking, audit and total control.
  • Teams can do offline Development without the need for an actual SQL Server. This provides more flexibility to work in an offline mode when needed.
  • Database Projects, when compiled, can generate either a DACPAC or *.sql script files. A DACPAC is essentially a zip file containing all the necessary scripts and metadata information for actual deployment to the target database. DACPAC files can either be deployed automatically using Visual Studio through the SSDT framework, or we can use DACPACs for automated database deployments as part of CICD and DevOps processes.
  • The SSDT framework has a kind of built-in SQL Server compiler as part of the framework itself. The framework is able to comprehend all standard database objects, custom SQL scripts and dependencies among the Database objects. Developers can make changes in the projects and SSDT will throw errors/warnings if it sees any issues with database integrity or any syntax issues. As an example, if a table is referred to in a Stored Procedure but is not available in the Project, then SSDT will throw a compile-time error, thus providing similar behavior to what we have in SQL Server Management Studio.
  • We can perform most database related tasks including some bits of administration within Visual Studio. (For complete control and Administration we will need more robust tooling like SQL Management Studio with actual SQL Servers).

Sample Project Structure

Details of the entire Sample Solution are available in the Readme file in the GitHub Repo and are also explained here.

  • There are fundamentally two Database Projects, ProductDatabase and PriceDatabase. You will see a third project, but that exists to handle a specific use case explained later.
  • ProductDatabase contains the ProductList and ProductCategory tables. ProductList holds some random basic Products, each belonging to a specific Product Category and having an associated Price value.
  • For demonstration, the Price for these Products is maintained in the separate PriceDatabase.
  • ProductDatabase has some standard dummy Stored Procedures, Views, Tables and a Synonym referring to the Price table from PriceDatabase. All these standard objects are under the folder named ‘dbo’.
  • Since this database refers to tables from PriceDatabase, a PriceDatabase reference has been added to it.
  • ProductDatabase also has a folder named DMLScripts which holds custom SQL scripts for DML operations like inserting master data. You should also see a ReleaseScripts folder that contains monthly-release-based SQL scripts under it. This will be explained later.
  • You should also see that the generated DACPAC is part of the Project itself under DACPAC_Output, plus two more folders containing the DACPAC deployment command and the generated logs and scripts. Again, these will be covered in a later section when we discuss the Release Process.
SSDT - ProductDatabase Project Structure - TechnologySplash.com


  • PriceDatabase only contains the table detail under the ‘dbo’ folder, along with the DMLScripts and DACPAC_Output folders as explained for ProductDatabase.
  • PriceDatabase has a view ‘dbo.vwProductPrices.sql’ that refers to a ProductDatabase table as well. This is again handled by implementing a Synonym in PriceDatabase. You won't see these objects directly in this database, for the following reason:
    a. ProductDatabase already refers to PriceDatabase, since it has a synonym pointing to it.
    b. Visual Studio will not allow a recursive reference back to ProductDatabase; this is standard Visual Studio behavior.
    c. We still need a way to achieve this, since without referring to ProductDatabase, a synonym referring to the ProductList table cannot be resolved.
  • The way around this is as follows:
    a. Let ProductDatabase refer to PriceDatabase as usual, as explained in the previous section.
    b. Split the PriceDatabase project into two different Database Projects. The first contains only the independent objects with no need for cross references; this is referred to here as PriceDatabase. The second is the one that has dependencies on ProductDatabase, such as the synonyms, and maybe more. We add a ProductDatabase reference to it. Also, since the objects in this new project are actually part of PriceDatabase itself, we add a PriceDatabase reference to it as well. This concept is also termed a Composite Database in SSDT, where we can split up, or compose, a full database with the help of multiple separate Database Projects handling specific database objects. You can read more on this in MS docs.
    c. This approach works because we have now removed the recursive project references. This new third Database Project is named ‘PriceDatabaseObjectsWithProductDatabaseDependencies‘.
SSDT - PriceDatabase  and its Dependency Split DB Project Structure - TechnologySplash.com

As a side note: Synonyms are used to refer to tables from another database. They provide an abstraction layer while cross-referencing other databases and objects. As an example, if we have to change the dependent database's name, then we just need to make the change in the synonym definition and everything else stays the same.
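A minimal sketch of such a synonym definition, assuming a SQLCMD variable named PriceDatabase holds the dependent database name (the exact definition in the sample repo may differ):

```sql
-- Synonym in ProductDatabase pointing at the Price table in PriceDatabase.
-- [$(PriceDatabase)] is a SQLCMD variable resolved at deployment time, so
-- renaming the dependent database only requires changing the variable value.
CREATE SYNONYM [dbo].[Price]
    FOR [$(PriceDatabase)].[dbo].[Price];
```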

Things Common for all the Database Projects

  • All standard Database objects like SPs, Tables, Views, Synonyms etc. are directly managed by the SSDT framework. SSDT will perform a Schema compare to generate an incremental script as explained earlier. All SQL files holding standard DB objects must have the Build Action ‘Compile’ so that SSDT can understand them.
  • The DMLScripts folder contains only the scripts that are not standard objects. This includes things like DML statements, data fixes etc. All DML files are generally set with the Build Action ‘None’ since they are deployed as part of Post-Deployment scripts. SSDT doesn't need to compile them since they are not part of the Database schema.
  • The ProductDatabase and PriceDatabase projects contain a Post-Deployment script file. It is used to run custom scripts and insert master data post deployment. This script file can either contain SQL scripts directly, or it can refer to SQL files using the SQLCMD syntax. Again, the Build Action is set to ‘None’ for these.

Below is the content of the Post-Deployment script file. You can see the SQL file references inside it rather than direct SQL.
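A hedged sketch of such a Post-Deployment file, assuming a master-data script named InsertMasterData.sql under DMLScripts (the actual file names in the repo may differ):

```sql
/*
Post-Deployment Script (Build Action: None).
Instead of embedding SQL directly, the SQLCMD :r directive
pulls in the individual DML files at deployment time.
*/
:r .\DMLScripts\InsertMasterData.sql
:r .\DMLScripts\ReleaseScripts\CurrentRelease\OrderedSQLSteps.sql
```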

DACPAC Output Format Explained

There are multiple compilation output formats available in Database Projects. The output can be a SQL script file or a DACPAC, which is also called a Data-tier Application Package. A DACPAC is like a zip containing multiple files under it. It can be uncompressed to visualize all the files available under it. It contains the model scripts holding the DDL statements and changes needed in the objects' schema, and also the pre- or post-deploy script file containing all the custom SQL required to be executed before or after deployment. Apart from the actual SQL scripts, a DACPAC also consists of metadata files containing information on these scripts.

DACPACs can be run against a target SQL Server using various utilities and tools available in Visual Studio as well as in SQL Server components. One of them is SQLPackage.exe, which is part of the SQL Server installation package and can be used for many functions, including exporting a schema, extracting a DACPAC from an existing database, publishing a DACPAC to a target SQL Server, Schema Comparison and more. We will be using it for publishing the DACPAC to the target SQL Server database.

Below is the content of a DACPAC file when it is uncompressed.

Content inside a DACPAC file

You can see more on SQLPackage.exe here.

As a side note, similar to DACPAC there is the BACPAC format as well, which is generated when both schema and data are extracted out of an existing SQL database. This is done using SQL Server Object Explorer in Visual Studio or using SQL Server Management Studio (SSMS) by exporting a Data-tier Application from an existing database.

Usage of Pre and Post-Deployment Scripts

Apart from standard Schema Deployment, Databases also need to manage custom DML scripts for things like master data management, data fixes, running data validation scripts etc. We cannot use standard DB objects for this purpose since these are not part of the DB schema. To handle this, SSDT Database Projects provide Pre- and Post-Deployment script files. There can only be one Pre-Deployment and one Post-Deployment script in any given database project. As the names suggest, the Pre-Deployment script file is used to run certain SQL scripts before the actual Schema Deployment, and the Post-Deployment script is used to run custom SQL scripts after Schema Deployment on a SQL Server. There can be many use cases for these; some examples are below:

  • We are going to run an incremental Database schema change script that will affect a set of tables, but before that there is a need to take a backup of the existing data in some specific tables – this is a Pre-Deployment script use case.
  • After deployment, we want to run certain data fix and DML scripts as part of a Release – this is a Post-Deployment script use case.
  • Before deploying the DACPAC, there is a need to check for specific data availability in tables, or a need to perform custom data validation – again, this is a Pre-Deployment script use case.
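The backup use case above can be sketched as a Pre-Deployment step like the following (the table names follow the sample ProductDatabase; the backup table itself is hypothetical):

```sql
-- Hypothetical Pre-Deployment step: snapshot a table
-- before a schema change that could affect its data.
IF OBJECT_ID('dbo.ProductList_Backup', 'U') IS NOT NULL
    DROP TABLE [dbo].[ProductList_Backup];

SELECT *
INTO [dbo].[ProductList_Backup]
FROM [dbo].[ProductList];
```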

To know more about them, you can visit this link.

Important – Both Pre- and Post-Deployment scripts have a Build Action of ‘None’. This means that SSDT will not perform a compile-time check on these files. This also applies to all the SQL script files referenced from the Pre- and Post-Deployment script files.

Build Actions on SQL Files and their importance

This has been explained in the sections above and is summarized again here:

  • In Database Projects, any SQL files that have the Build Action set to ‘None’ are not picked up for compile-time Database integrity and validation checks. SSDT excludes these files from all SQL validations.
  • SQL script files that have the Build Action set to ‘Compile’ are picked up by SSDT's built-in SQL Server compiler and validated.

SQLCMD variables in Database Projects

SQLCMD variables provide great flexibility in Database Projects. With these, we can avoid hardcoding specific object names in the Database. We can also use them to control the execution of SQL scripts in the Pre- and Post-Deployment script files. In this sample, we have used them to refer to dependent projects, as in the case of the synonym definitions. Also, we are using the SQLCMD syntax to refer to custom SQL files in the Post-Deployment script file.

An example of SQLCMD variables under a Database Project's settings is below.

SQLCMD variables under project settings.
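As a hedged sketch of using a SQLCMD variable to control script execution (the Environment variable, the column and the inserted row are hypothetical, not part of the sample solution):

```sql
-- SQLCMD substitutes $(Environment) with its configured value before the
-- batch runs, so this becomes a plain T-SQL string comparison at deploy time.
IF '$(Environment)' = 'Dev'
BEGIN
    -- Seed development-only data; skipped for all other environments.
    INSERT INTO [dbo].[ProductCategory] ([CategoryName])
    VALUES ('DevTestCategory');
END
```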

Publishing Profile and Storing the Profile

While publishing/deploying a Database Project, we can control and set a lot of settings before deployment. These are made available during the Publish operation; some of the important ones are below.

  • Whether to re-create the Database every time or publish only an incremental script.
  • Whether we want to update the target schema if doing so leads to data loss.
  • Defining the connection string for the target database.
  • Setting values for the SQLCMD variables used. The same will be used during deployment.
  • Setting the target Database name for deployment.
  • Backing up the target Database before deployment.
  • Setting up Drop and Ignore rules based on the Schema compare between the source DACPAC and the target database, and much more.

Rather than setting these values with every deployment, they can be stored as a Publishing Profile and then reused for every repeated deployment. This is called a Publishing Profile and is saved within the Database Project as an XML file.
As a possible use case, we can have a different profile for every environment: for the Live environment we only deploy incremental scripts, but for the local development environment we re-create the entire database every time so that every workstation has the latest version.

Below shows publishing profile xml file under Product Database

Publish Profile in Database Project - SSDT

And below are its contents as an example. We can see the values of the SQLCMD variables, the database connection string and other settings saved in this file.

Settings under Publish profile file.
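As a trimmed, hedged sketch of such a profile (the property values here are illustrative; the actual file in the repo will differ), the publish profile is an MSBuild-style XML document:

```xml
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="Current" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <TargetDatabaseName>ProductDatabase</TargetDatabaseName>
    <TargetConnectionString>Data Source=.;Integrated Security=True</TargetConnectionString>
    <!-- False = incremental change script; True = drop and re-create -->
    <CreateNewDatabase>False</CreateNewDatabase>
    <BlockOnPossibleDataLoss>True</BlockOnPossibleDataLoss>
    <BackupDatabaseBeforeChanges>True</BackupDatabaseBeforeChanges>
  </PropertyGroup>
  <ItemGroup>
    <!-- SQLCMD variable values applied during deployment -->
    <SqlCmdVariable Include="PriceDatabase">
      <Value>PriceDatabase</Value>
    </SqlCmdVariable>
  </ItemGroup>
</Project>
```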

Release Management in Database Projects

To explain the Release Process, an assumption is made that each release is a monthly process. You should be able to see the DMLScripts/ReleaseScripts folder under ProductDatabase and PriceDatabase. Please note that this is one possible approach to managing deployments. Depending on individual needs, the process can be customised accordingly. Below is the process:

  • The current Release is always done through the CurrentRelease folder.
  • It contains the list of all SQL scripts that have to be run for the current cycle. We can name the scripts to define the order as well.
  • Apart from the actual scripts, there is a file named OrderedSQLSteps.sql. This file refers to all the actual SQL scripts so as to run them in order.
  • The Post-Deployment script file only runs OrderedSQLSteps.sql. This provides an abstraction so that we don't have to make changes in the Post-Deployment file with every release cycle.
  • When deployment happens, the Post-Deployment script runs after schema deployment and the custom scripts are run in order.
  • Post release, the CurrentRelease folder can be renamed with Month/Year details and possibly a release number. For demonstration, you should be able to see Release folders for June and July 2020 in there as examples.
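The abstraction above can be sketched as follows; the numbered file names are hypothetical, while OrderedSQLSteps.sql is the entry point described above:

```sql
-- OrderedSQLSteps.sql: the single file referenced from the Post-Deployment
-- script. Numbering the included scripts makes the execution order explicit.
:r .\01_AddNewProductCategories.sql
:r .\02_FixProductPrices.sql
:r .\03_ValidateMasterData.sql
```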

SQLPackage.exe for Command line and CICD Integration

SQLPackage.exe is a utility available with the SQL Server installation and also as part of the SSDT framework. With a SQL Server installation, it can be found under a folder structure something like C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin. Similarly, it is available with the SSDT framework as well. This utility can be used for a lot of purposes and functions, including exporting a Database schema, extracting a DACPAC from an existing database, publishing a DACPAC to a target SQL Server, Schema Comparison etc. Here it is used for the publishing process from the command line. This can be used for database deployment automation and can be integrated as part of a CICD pipeline.

For demonstration, this EXE has been used both to deploy the DACPAC by re-creating the database every time and to deploy only the change scripts. Under ProductDatabase, you should see two folders named DACPACDeployment_OnlyChangeScripts and DACPACDeployment_ReCreatedEntireDatabase handling these two scenarios. The actual command used to deploy the DACPACs is also available under each folder. Fundamentally, we need to pass the target database connection string, the Publishing Profile and whether or not to enable detailed logging. Based on the Publishing Profile settings file, the EXE will control whether to recreate the database or deploy incremental scripts.

Below are the details present under ProductDatabase for demonstration

SQLPackage.exe command file containing the actual command, plus the SQL script and log files generated by it.

Below is a sample of the actual content of the SQLPackage.exe command used for this sample.

--Command to deploy only the changes made in the existing database, create diagnostic logs and the script which will be run during deployment.
--The profile used here deploys only the changes in the database. No change in the command; only the profile file has been changed for this purpose.
sqlpackage.exe /Action:Publish /Diagnostics:True /DiagnosticsFile:"E:\Blogging\Projects\DACPACDiagnostics.txt" /DeployScriptPath:"E:\Blogging\Projects\DACPACDeployScriptFile.sql" /Profile:"E:\Blogging\Projects\SQLServerDataTools\SqlDataTools_FromDevToProduction\ProductDatabase\ProductDatabase.publish.xml" /SourceFile:"E:\Blogging\Projects\SQLServerDataTools\SqlDataTools_FromDevToProduction\ProductDatabase\DACPAC_Output\ProductDatabase.dacpac" /TargetConnectionString:"Data Source=LAPTOP-SO8773KH;Integrated Security=True;Persist Security Info=False;Pooling=False;MultipleActiveResultSets=False;Connect Timeout=60;Encrypt=False;TrustServerCertificate=False"

Please note that the actual integration of this utility with standard DevOps products is not shown, but in most cases it should be a straightforward process. The build server needs to have SSDT or SQL Server installed, and through products like Octopus/Azure DevOps etc. we should be able to call the EXE via the command line and deploy the DACPAC. In Azure DevOps, there may also be standard SSDT tasks available that can be used to deploy to Database servers as well.

You can see more on SQLPackage.exe here.


I hope this content helps you with learning SQL Server Data Tools and achieving full-scale DevOps automation for Databases. If you have any feedback or see issues in the content, do share via the comments and I will try my best to correct it and add more accordingly. I would also like to hear what your experience with SSDT has been, including issues faced or any alternatives you have used for specific reasons. Cheers!
