Azure DevOps Pipelines – SQL Server Deploy – One to rule them all
In one of my previous blog posts, I used the SQL Server database deploy task to deploy my DACPAC to SQL Server. Unfortunately, this task became deprecated in Release Pipelines. In this blog post, I would like to share the alternative.
Additionally, we will be moving from a Classic Release pipeline to a YAML pipeline. The YAML pipeline will be responsible for building and deploying our Database Projects.
Quick Overview
Why move to YAML?
The first question you might be asking is: why should we migrate our Release Pipeline to a YAML pipeline? Well, by default, the classic pipelines are disabled. Besides that, the SQL Server Deploy task became deprecated in Release Pipelines.
What are we going to create?
As you will see during the step-by-step example, we will be reusing a lot of code that we have used to create the YAML Build pipeline in one of my previous blog posts, Azure DevOps YAML Pipeline – Build Database Projects
Additionally, we will be adding an approval step, where the person responsible for the deployment needs to approve before the deployment continues after a successful Build process.
To get this done, we will be creating 3 different jobs in our YAML pipeline, as sketched below:
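As a high-level sketch, this is the structure we will end up with; the job names match the ones used in the step-by-step example below, and the pools and steps are filled in as we go:

jobs:
- job: BuildProcess      # 1. Build the Database Project and publish the .dacpac as an artifact
- job: WaitForApproval   # 2. Agentless job that waits for manual approval
  dependsOn: BuildProcess
- job: DeployToTarget    # 3. Download the artifact and deploy it to SQL Server with SqlPackage
  dependsOn: WaitForApproval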
Step-by-step example
As mentioned before, we will be creating a YAML pipeline to manage the build, approval, and deployment process. To get started, we navigate to Azure DevOps. To continue, we go to the Pipelines section and choose Pipelines.
Now, we can create our new pipeline by clicking the New Pipeline button in the top-right corner. As soon as we click the New Pipeline button, we need to define where our code is. In this blog post, I will be using my Azure DevOps Repo.
As soon as we have confirmed, we need to specify which repository we want to use. After choosing the right repository, we can configure our pipeline. For now, we will choose the Starter pipeline.
As a result, we get a template YAML script which looks like this:
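For reference, the Starter pipeline template that Azure DevOps generates looks roughly like this (the exact contents may differ slightly in your organization):

trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'

- script: |
    echo Add other tasks to build, test, and deploy your project.
    echo See https://aka.ms/yaml
  displayName: 'Run a multi-line script'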
Since we will be taking a different approach, we need to remove the lines after the trigger section. We are now completely set to configure our Build process.
To continue, we first need to define that we will be using jobs. To get this done, we use the following syntax:
jobs:
Build
Now that we have defined that we are using jobs, we can configure our first job, which will be used to execute our Build process. In the configuration, we define the job name and the agent pool that we want to use to execute the job.
We use the following syntax for the configuration:
- job: BuildProcess # This is the technical name of the job
  displayName: Build Process # The readable name which will be shown
  pool: 'On-Premise Agent Pool' # The pool to use to execute the job
  steps:
Important to know: everything after the #-sign is a comment that I added for documentation.
With the code above, we define our Build Process job, which will run in my ‘On-Premise Agent Pool’. To deploy to SQL Server, we need a self-hosted Azure DevOps agent on our machine. If you are wondering how to set this up, you can follow this blog post: SQL Server Database Deployment Automation with Azure DevOps
As a next step, we start to configure the required steps to build our Database Project.
MSBuild Task
As a first task, we want to build our Database Project. To do this, we use the MSBuild task. If you want, you can use the assistant in Azure DevOps, which you can find on the right.
If you use the assistant, search for the MSBuild task, execute the required configuration and add it to the YAML file. I will be using YAML directly.
- task: MSBuild@1 # MSBuild task
  inputs:
    solution: '**/*.sqlproj' # Build the .sqlproj files
    msbuildArchitecture: 'x64'
As you can see, we are searching for .sqlproj files in the repository. We changed from .sln to .sqlproj because the .sln file is not available when you create a Database Project in Azure Data Studio.
Now that we successfully created the Build task, we continue by creating a Copy Files task and a Publish Pipeline Artifact task.
Copy Files Task
We will look at the Copy Files task first. As before, feel free to use the assistant; below is the YAML configuration for the Copy Files task.
# Execute the Copy Files task
- task: CopyFiles@2
  inputs:
    # Configure where the files are, in this example we use the default build directory
    SourceFolder: '$(agent.builddirectory)'
    # Define which content you want to copy
    # If you do not specify this, all files will be included in the Artifact later
    Contents: '**\*.dacpac'
    TargetFolder: '$(build.artifactstagingdirectory)'
As you can see in the code above, we are only copying the .dacpac files that are available after the Database Project Build.
Publish Pipeline Artifact Task
Now, we can finalize our first job by configuring the Publish Pipeline Artifact task.
# Create an artifact from the files and publish the Artifact
- task: PublishPipelineArtifact@1
  inputs:
    # Define the folder for which you want to create and publish an Artifact
    targetPath: '$(build.artifactstagingdirectory)'
    # Define the name of your Artifact, this will be used in the deployment pipeline afterwards
    artifact: 'prd-drop-dacpac'
The most important thing to configure is the name of the artifact. In my case, I’m giving it the name prd-drop-dacpac, since this pipeline builds the .sqlproj files in my master branch and I’m only publishing the .dacpac files.
You can save your YAML Pipeline and run it for the first time to validate if everything was configured properly.
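To make the nesting explicit, this is roughly how the complete Build job fits together (a recap of the snippets above):

jobs:
- job: BuildProcess # This is the technical name of the job
  displayName: Build Process # The readable name which will be shown
  pool: 'On-Premise Agent Pool' # The pool to use to execute the job
  steps:
  # Build the Database Project
  - task: MSBuild@1
    inputs:
      solution: '**/*.sqlproj'
      msbuildArchitecture: 'x64'
  # Copy the .dacpac files to the artifact staging directory
  - task: CopyFiles@2
    inputs:
      SourceFolder: '$(agent.builddirectory)'
      Contents: '**\*.dacpac'
      TargetFolder: '$(build.artifactstagingdirectory)'
  # Create and publish the artifact
  - task: PublishPipelineArtifact@1
    inputs:
      targetPath: '$(build.artifactstagingdirectory)'
      artifact: 'prd-drop-dacpac'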
Approval
Now that we have successfully created the job for the Build Process, we can create our Approval Process. I’m adding an approval step here since I want to be able to deploy when I want, and not necessarily whenever a new change has been merged to my master branch. If I were creating a pipeline to deploy to my development environment, I wouldn’t bother including the approval step.
For the Approval step, we will first be configuring a new job in the existing pipeline. To get this done we will be using the following syntax (as we have used before):
- job: WaitForApproval
  displayName: Wait For Approval
  pool: Server
  dependsOn: BuildProcess
  steps:
Important to notice in the small code sample above is that we are using a different pool to execute this job. We create an “agentless job” because the Manual Validation task can only be executed in this kind of job. By using the keyword Server as the pool, we indicate that we want to run the job as an agentless job.
We need to add the dependsOn property as well, to make sure the approval only starts after the artifact has been created and published.
Now we can add the Manual Validation task to simulate our approval step. In this step, we will be identifying who needs to approve before we can deploy the new version of our Database Project.
- task: ManualValidation@0
  inputs:
    notifyUsers: 'olivier@oliviervs.be'
    instructions: |
      Hi
      New changes are waiting for your approval before they can be deployed.
      Kind regards
    onTimeout: reject
As you can see, I used my own email address to be notified when a new version is waiting for my approval. In case you don’t want to notify anyone, you can define an empty string ('').
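As a minimal sketch, a variant that doesn’t notify anyone could look like this (the same task, with notifyUsers left empty):

- task: ManualValidation@0
  inputs:
    # Leave empty if you don't want to notify anyone
    notifyUsers: ''
    instructions: 'New changes are waiting for your approval before they can be deployed.'
    onTimeout: reject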
Deploy
As a final step, we are going to add our Deployment process as a separate job to our YAML pipeline. So we first define our job.
- job: DeployToTarget
  displayName: Deploy Changes To Target
  pool: 'On-Premise Agent Pool'
  dependsOn: WaitForApproval
  steps:
  - checkout: none
As you can see in the code above, I have added a couple of things. First, we need to specify the pool we would like to use; at this point, it is important to use a pool that has access to the target server.
Secondly, we add a dependency to the approval job. We don’t want to deploy before the artifact approval.
Finally, I would recommend adding the checkout: none syntax as well. If you don’t add this, the job will automatically check out your repository again, which we don’t need here.
Download Pipeline Artifact
Before we can deploy the .dacpac to SQL Server, we need to download the artifact that we have created earlier. To get this done, we add a Download Pipeline Artifact task.
- task: DownloadPipelineArtifact@2
  inputs:
    buildType: 'current'
    targetPath: '$(Pipeline.Workspace)'
As you can see, we don’t need to specify much. The most important thing to specify is the buildType. This defines that we want to download the artifact created in this build.
Deploy to SQL Server
As a final step in our process, we want to deploy our changes to SQL Server. To deploy a dacpac to SQL Server, we are using SqlPackage. If you would like to have more info on SqlPackage, this link should provide plenty of useful information: SqlPackage – SQL Server | Microsoft Learn
To use SqlPackage, we will be adding a Command Line task, which replaces the SQL Server database deploy task that we used in Release pipelines.
- task: CmdLine@2
  inputs:
    script: '"C:\Program Files\Microsoft SQL Server\160\DAC\bin\SqlPackage.exe" /Action:Publish /SourceFile:"C:\agent\_work\2\prd-drop-dacpac\s\AdventureWorks\bin\Debug\AdventureWorksLT.dacpac" /TargetServerName:localhost /TargetDatabaseName:AdventureWorksLT2019 /TargetEncryptConnection:False'
The first step in using SqlPackage is to determine where it is located on the agent you are using to deploy. The SqlPackage utility is probably installed in the following folder:
C:\Program Files\Microsoft SQL Server\160\DAC\bin\SqlPackage.exe
As soon as we have identified where the SqlPackage utility is located, we can specify the action, which dacpac to deploy, the target server and target database.
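As a sketch, instead of hardcoding the agent work folder, you could also build the /SourceFile path from the $(Pipeline.Workspace) variable (which points to the work folder of the current run, e.g. C:\agent\_work\2), assuming the artifact keeps the folder structure shown above:

- task: CmdLine@2
  inputs:
    # Same SqlPackage call, but the artifact location is resolved through $(Pipeline.Workspace)
    script: '"C:\Program Files\Microsoft SQL Server\160\DAC\bin\SqlPackage.exe" /Action:Publish /SourceFile:"$(Pipeline.Workspace)\prd-drop-dacpac\s\AdventureWorks\bin\Debug\AdventureWorksLT.dacpac" /TargetServerName:localhost /TargetDatabaseName:AdventureWorksLT2019 /TargetEncryptConnection:False'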
If something goes wrong with your code, you can find the full YAML file on my GitHub page: blogging/Azure DevOps/Pipelines/cicd-sqlserver-one-to-rule-them-all.yml
Using this now. Thank you!
That’s wonderful to hear, if you are looking for any kind of extensions, just reach out.
It helped me to successfully deploy on SQL Server On-Prem. Thanks.
I don’t understand, though, why we need these tasks: CopyFiles, Publish, and Download, since we call SqlPackage.exe with a hardcoded /SourceFile parameter?
Couldn’t we execute SqlPackage right after the build task and specify the .dacpac files in the work folder?
Hi,
That is a great question. And, to be fair, I have never tried executing it that way.
In general, I always want to separate my build and deployment actions. (CI and CD)
By publishing my build result, I can always access the result later as well.
If you only use the temporary result in the working directory, you will lose that result as soon as a new build starts and won’t be able to check what has been used to deploy if required.