Automated Database Deployment Using OAuth in Azure DevOps

This article explains one way to perform automated database deployments to Azure SQL from Azure DevOps CI/CD pipelines. The process obtains a JWT OAuth access token and passes it to the SQL connection to perform the necessary database operations. This lets DevOps teams move away from database credentials (user ID and password), making the deployment and CI/CD process more secure. This post does not cover the fundamentals of Azure SQL Database, Microsoft Entra ID, or Azure DevOps pipelines; it is assumed that the implementation team already has the knowledge and permissions needed to access and use these cloud resources.

The steps involved are as follows –

  • Create a new App Registration (Application) in Microsoft Entra ID (previously called Azure Active Directory)
  • Add this application as an external database user in Azure SQL Database
  • Grant the necessary SQL database roles, such as db_datareader and db_datawriter, to this external user. Since we want to deploy changes to our database, write permissions will generally be required.
  • Create an Azure DevOps YML pipeline with an agent job that will handle the database deployment.
  • Add an Azure PowerShell task to the database deployment agent job. This PowerShell script will handle all the core logic of connecting to the database and deploying any SQL script files to it. It is assumed that the developer has a good understanding of how to use PowerShell and create PowerShell functions for automating various kinds of tasks. We are using the Azure PowerShell task here since it runs all standard PowerShell commands along with Azure PowerShell commands, but this process can also be handled with a regular PowerShell task.
  • The idea is to get an OAuth token from Microsoft Entra ID by calling the access token endpoint for our Entra application and then passing this token as part of the database connection to connect to the database. Azure SQL Database will accept this connection because our Entra ID application has the necessary database permissions (we added it to the database earlier) and the JWT token carries the database scope.
  • Create a SQL script with some DDL or DML commands for testing the entire process and save it as a .sql file in your Azure DevOps repository. For testing purposes we could even use an inline SQL script, but that will not be the case in any production application or CI/CD process.
  • Finally, run the pipeline and test.
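As an illustration of the token step above, the client-credentials request against the Entra ID token endpoint can be sketched in PowerShell as below. This is the "call the endpoint directly" alternative; the tenant ID, client ID, and secret values are placeholders.

```powershell
# Sketch: acquire an access token for Azure SQL via the OAuth 2.0
# client-credentials flow (tenant/client/secret values are placeholders).
$tenantId     = "<your-tenant-id>"
$clientId     = "<entra-app-client-id>"
$clientSecret = "<entra-app-client-secret>"

$body = @{
	grant_type    = "client_credentials"
	client_id     = $clientId
	client_secret = $clientSecret
	scope         = "https://database.windows.net/.default"  # Azure SQL scope
}

$response = Invoke-RestMethod -Method Post `
	-Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" `
	-Body $body
$accessToken = $response.access_token   # JWT to attach to the SQL connection
```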

Let's look at each step in a bit more detail below –

Step 1 – Create a new App Registration in Microsoft Entra ID

Entra ID Application

We also need to add a Client Secret to this application as shown below. The Client Secret value is visible only when it is created, so note it down for later use in the YML pipeline.

Once the application is created in Entra ID, note its Client ID from the main application screen. This will be used later in the pipeline for deployment.

Step 2 – Add this application as an external database user in Azure SQL Database and grant the necessary roles

CREATE USER [SQLConnectionAppViaAzureDevOps] FROM EXTERNAL PROVIDER;
ALTER ROLE db_datareader ADD MEMBER [SQLConnectionAppViaAzureDevOps];
ALTER ROLE db_datawriter ADD MEMBER [SQLConnectionAppViaAzureDevOps];
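To confirm that the external user was created and the roles were granted, a quick check against the database's principal catalog views can be sketched as:

```sql
-- Verify the external user exists and list its role memberships
SELECT dp.name AS UserName, r.name AS RoleName
FROM sys.database_principals dp
LEFT JOIN sys.database_role_members rm ON rm.member_principal_id = dp.principal_id
LEFT JOIN sys.database_principals r ON r.principal_id = rm.role_principal_id
WHERE dp.name = 'SQLConnectionAppViaAzureDevOps';
```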

Step 3 – Create an Azure DevOps YML pipeline with an agent job that will handle the database deployment.

This is not explained in depth here, as it is a separate, vast topic. It is assumed that the developer already has an understanding of it. A short sample of a database deployment agent job in the pipeline could look like the one below. It is assumed that all the parameters required by our PowerShell script will be read from pipeline variables.

trigger:
- main
stages:
  - stage: Release
    pool:
      name: 'YourCompanyDeploymentAgentPool' # Agent pool should be able to reach Azure SQL, so firewall rules should be in place on the Azure SQL server
    jobs:
    - job: DatabaseDeployment
      displayName: 'Automated DB Deployment with OAuth'
      steps:
        - task: AzurePowerShell@5
          displayName: 'Run Database Deployment Script'
          inputs:
            azureSubscription: 'YourAzureServiceConnection' # service connection to your Azure subscription
            azurePowerShellVersion: 'LatestVersion'
            ScriptType: 'FilePath'
            ScriptPath: '$(ArtifactsFolderPath)/Scripts/DeployDatabase.ps1' # script location is assumed to come from a build stage OR directly from the repository
            ScriptArguments: >-
              -DBScriptsLocation "$(DBSQLScriptsFolder)"
              -DatabaseServerName "$(AzureSqlServerName)"
              -DatabaseName "$(AzureSQLDatabaseName)"
              -EntraAppClientId "$(EntraAppClientID)"
              -EntraAppClientSecret (ConvertTo-SecureString "$(EntraAppClientSecret)" -AsPlainText -Force)
              -EntraTenantId "$(EntraTenantID)"
              -SqlUser "NA"
              -SqlPassword "NA"
              -ConnectionType "JWT"

A few notes on the arguments above –

  • DBSQLScriptsFolder is the path of the SQL script files that we ultimately want to execute in the Azure SQL database.
  • AzureSqlServerName is the full server name in Azure, and AzureSQLDatabaseName is the database name.
  • EntraAppClientID is the Client ID of the Entra ID application we created in the first step, and EntraTenantID is the ID of the Entra ID tenant where all our cloud resources live.
  • EntraAppClientSecret is the Client Secret of the Entra ID application added as part of the first step. The secret should be stored either in Key Vault or as a secret variable in the pipeline variables, and since the script expects a secure string, we wrap it with ConvertTo-SecureString.
  • The script is designed to work both with JWT and with SQL credentials. Since we are focusing on JWT here, SqlUser and SqlPassword can be passed as "NA".

Step 4 – Create a SQL script with some DDL or DML commands for testing

For testing purposes, create some SQL script files that we want to run in the Azure database and put them in a folder in your Azure DevOps repository. The intention is to read these SQL files and deploy them through our CI/CD process. The SQL script files themselves are not covered as part of this topic and are assumed to exist.
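As a minimal illustration, a test script (with a hypothetical table name and columns) could look like:

```sql
-- Sample DDL/DML for testing the deployment (hypothetical table name)
IF OBJECT_ID('dbo.DeploymentTest', 'U') IS NULL
    CREATE TABLE dbo.DeploymentTest (Id INT PRIMARY KEY, Name NVARCHAR(100));

INSERT INTO dbo.DeploymentTest (Id, Name) VALUES (1, 'PipelineCheck');
```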

Step 5 – Create the actual PowerShell script DeployDatabase.ps1, the main script that handles all the core logic of connecting to the database and deploying any SQL scripts to it. These SQL commands can live in a .sql file or can also be inline scripts. Below you can find the content of this script file; hopefully it helps you implement this functionality quickly. This PowerShell script should also be part of the DevOps repository and part of our project itself, since only then will it be picked up by our YML pipeline.

#parameters to be passed to the PowerShell script file when it is called in the DevOps pipeline
param (
		[string]$DBScriptsLocation,
		[string]$DatabaseServerName,
		[string]$DatabaseName,
		[string]$EntraAppClientId,
		[securestring]$EntraAppClientSecret,
		[string]$EntraTenantId,
		[string]$SqlUser,
		[string]$SqlPassword,
		[string]$ConnectionType = "JWT")
$sqlModule = Get-Module -Name SqlServer -ListAvailable		#This module is required for handling core SQL activities like connections, running commands etc.
if($null -eq $sqlModule) {
	Write-Host "Install the SqlServer module"
	#This module is used similarly to the way we handle SQL actions in classic Microsoft ADO.NET.
	Install-Module -Name SqlServer -Scope CurrentUser -Force -AllowClobber
}
#Note: PowerShell parses scripts top to bottom, so in the actual .ps1 file this invocation must appear AFTER the function definitions below. It is shown here to keep the flow of the article readable.
Write-Host "Start DB Deployment"
DeployDatabase -serverName $DatabaseServerName -database $DatabaseName -dbScriptsFolder $DBScriptsLocation -EntraAppClientId $EntraAppClientId -EntraAppClientSecret $EntraAppClientSecret -EntraTenantId $EntraTenantId -SqlUser $SqlUser -SqlPassword $SqlPassword -ConnectionType $ConnectionType
Write-Host "DB Deployment Completed"
#Below is the function to read/query/select records from Azure SQL
function ExecuteReader {
	param(
		[System.Data.SqlClient.SqlConnection]$sqlConnection, #this function expects a SQL connection, whether it uses SQL credentials or the OAuth procedure
		[parameter(mandatory=$true)][string]$sqlQuery
		)
		$Datatable = New-Object System.Data.DataTable
		$sqlConnection.Open()
		$Command = New-Object System.Data.SqlClient.SqlCommand
		$Command.Connection = $sqlConnection
		$Command.CommandText = $sqlQuery
		$Reader = $Command.ExecuteReader()
		$Datatable.Load($Reader)
		$sqlConnection.Close()
		return ,$Datatable  #the leading comma prevents PowerShell from unrolling the DataTable into individual rows
}
#Below function will be used to perform DDL and DML actions on the Azure SQL database.
function ExecuteNonQuery {
	param(
		[System.Data.SqlClient.SqlConnection]$sqlConnection, #this function expects a SQL connection, whether it uses SQL credentials or the OAuth procedure
		[parameter(mandatory=$true)][string]$sqlQuery
		)
		$sqlConnection.Open()
		$Command = New-Object System.Data.SqlClient.SqlCommand
		$Command.Connection = $sqlConnection
		$Command.CommandText = $sqlQuery
		$Command.ExecuteNonQuery() | Out-Null  #this performs the actual action on the SQL database by running our SQL commands/scripts
		$sqlConnection.Close()
}
function GetDatabaseConnection {
	param(
		[string]$serverName,
		[string]$databaseName,
		[string]$EntraAppClientId,
		[securestring]$EntraAppClientSecret,
		[string]$EntraTenantId,
		[string]$SqlUser,
		[string]$SqlPassword,
		[string]$ConnectionType
		)

	$Connection = New-Object System.Data.SqlClient.SqlConnection
	
	if($ConnectionType -eq "JWT")
	{
		
			#Implement logic to get a JWT token from Microsoft Entra ID for our application. For this we use the MSAL.PS module (built on MSAL.NET) available from Microsoft.
			#This can also be handled by calling the token endpoint of MS Entra ID directly and passing our application Client ID and Client Secret as part of the request body. Please check online if you would like to use that approach rather than this PowerShell module.
			
			$MsalModule = Get-Module -Name "MSAL.PS" -ListAvailable
			if($null -eq $MsalModule) {	
				Write-Host "Install the MSAL PowerShell module"
				Set-PSRepository PSGallery -InstallationPolicy Trusted
				Install-Module -Name MSAL.PS -Scope CurrentUser -Force -AllowClobber
			}
		
		Write-Host "Pass the required parameters and call the function available in this module"
		
		$Scope = "https://database.windows.net/.default"  #we request the access token for the database scope since we use it to connect to our Azure SQL database
		
		$result = Get-MsalToken -ClientId $EntraAppClientId -ClientSecret $EntraAppClientSecret -TenantId $EntraTenantId -Scopes $Scope
		
		$jwtAccessToken = $result.AccessToken
		
		$Connection.ConnectionString = "Data Source=$($serverName);Initial Catalog=$($databaseName);Connect Timeout=300;"
		$Connection.AccessToken = $jwtAccessToken  #the token is set on the connection object itself; note that it cannot be placed inside the connection string
	}
	else
	{
		#In case we want to use database credentials to connect to the database
		$Connection.ConnectionString = "Server=$($serverName);Database=$($databaseName);User ID=$($SqlUser);Password=$($SqlPassword);Trusted_Connection=False;"
	}
	
	Write-Host "Connection created and returned"
	return $Connection
}

function DeployDatabase {
	param(
	[string]$serverName,
	[string]$database,
	[string]$dbScriptsFolder,
	[string]$EntraAppClientId,
	[securestring]$EntraAppClientSecret,
	[string]$EntraTenantId,
	[string]$SqlUser,
	[string]$SqlPassword,
	[string]$ConnectionType
	)
	try {
	if((Test-Path $dbScriptsFolder) -eq $false)
	{
		throw "Directory that holds the SQL script files does not exist"
	}
	$sqlConnection = GetDatabaseConnection -serverName $serverName -databaseName $database -EntraAppClientId $EntraAppClientId -EntraAppClientSecret $EntraAppClientSecret -EntraTenantId $EntraTenantId -SqlUser $SqlUser -SqlPassword $SqlPassword -ConnectionType $ConnectionType
	#This function goes through the database script folder, reads the files one by one, and runs the SQL content by calling the ExecuteNonQuery function created earlier
	RunDatabaseQueries -ScriptsFolder $dbScriptsFolder -sqlConnection $sqlConnection
	}
	catch
	{
		Write-Error "Database deployment failed with $($_.Exception.Message)"
	}
}
function RunDatabaseQueries {
	param(
	[System.Data.SqlClient.SqlConnection]$sqlConnection,
	[string]$ScriptsFolder
	)
	#If we have some inline queries, we can handle them here too as shown below
	$sampleDMLCommand = "UPDATE Employee SET Name='xyz' WHERE Id = 23"
	ExecuteNonQuery -sqlConnection $sqlConnection -sqlQuery $sampleDMLCommand
	#Below shows how we can loop through the content of a folder and run each script file one by one
	$allSqlFiles = Get-ChildItem $ScriptsFolder -Filter *.sql #no order or sort behavior is applied here; if required, we can pipe the filtered result to the Sort-Object PowerShell command
	try {
		ForEach($file in $allSqlFiles) {
			Write-Host "Reading file $($file.Name)"
			$sqlContent = [IO.File]::ReadAllText($file.FullName)  #another way to build the full name is Join-Path -Path $ScriptsFolder -ChildPath $file.Name
			ExecuteNonQuery -sqlConnection $sqlConnection -sqlQuery $sqlContent
		}
	}
	catch {
		$errors = GetExceptionMessages -exception $_.Exception
		$allErrorMessages = $errors -join "; "
		Write-Error "Error occurred while running sql file $($file.FullName) with messages: $allErrorMessages"
	}
}

function GetExceptionMessages {
	param([Exception]$exception)
	$exceptionMessages = New-Object System.Collections.ArrayList
	$ex = $exception
	while($ex) {
		[void]$exceptionMessages.Add($ex.Message)  #[void] suppresses the index that ArrayList.Add returns
		$ex = $ex.InnerException
	}
	return $exceptionMessages
}

Step 6 – Finally, trigger the YML pipeline and test the overall deployment process.

Additional Complexities that can arise in Database Deployment in a live environment

  • We may have multiple different kinds of SQL scripts, where some scripts are to be executed only once and in a specific order. In this case the script above should be modified to sort the files so they are picked in a certain order, which could be the datetime of creation OR a custom sort based on a file naming convention.
  • We may also want to maintain the SQL script release history in the database itself and prevent scripts from running again if they were already run in previous releases. In this case, we can add logic to first read all the files that were run previously (using the ExecuteReader function) and then use that list as a filter so those files are skipped in our SQL scripts folder. Check online how to filter and exclude files when reading a directory in a file system. This gives us the flexibility to run only the new files checked into the repository folder rather than all of them, which is a real use case for most production applications.
  • If there are GO statements inside your SQL files, this script can fail, because GO is a batch separator understood by client tools rather than a T-SQL statement. In that case, after reading each file's content, additionally split it on GO statements and execute each batch separately.
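A minimal sketch of such a split, assuming GO always appears on its own line and reusing $sqlContent, $sqlConnection, and the ExecuteNonQuery function from the script above, could look like:

```powershell
# Split a script on batch-separator lines ("GO" on its own line,
# case-insensitive) and run each resulting batch separately.
$batches = $sqlContent -split '(?im)^\s*GO\s*$'
foreach($batch in $batches) {
	if(-not [string]::IsNullOrWhiteSpace($batch)) {
		ExecuteNonQuery -sqlConnection $sqlConnection -sqlQuery $batch
	}
}
```

The `(?im)` flags make the .NET regex case-insensitive and multiline, so `^` and `$` match at each line boundary rather than only at the start and end of the whole script.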

Hope you find this post useful and do comment in case you find discrepancies or need additional information.
