9

Big Picture: We are using the Azure DevOps release process (so far we create steps in the designer, not in YAML pipelines). We release to 3 environments with 3 different databases. Part of the release is deploying a database from a DACPAC file. Since using SqlPackage.exe to publish directly to the database is not very transparent (you don't see and review the actual SQL script), we wanted to do the release in 2 stages:

  1. Create a SQL script from the DACPAC and review it
  2. After approval, run the app and DB deployment from the previously generated script.

Issue: How do we share the SQL script file between stages, and how do we view it for approval? The stages can be triggered on different agents.

What I've tried:

  1. Publishing the script as a build artifact - this won't work because generating the script requires connecting to a database, and that connection should not be part of the build process, especially a connection to the production database.
  2. Publishing the artifact to Azure Pipelines as a release step - this is not allowed in releases; the publish task only works in builds.
  3. Publishing the artifact to a file share - I'm not sure how exactly this works; the documentation is not very good. Moreover, regular Windows file sharing would be difficult to set up in our infrastructure, and I would rather avoid it.

Any other suggestions?

Mateusz Moska
  • 1,921
  • 1
  • 14
  • 18

2 Answers

2

While you can't use pipeline artefacts in a release, you can use Universal Packages in Package Management to publish arbitrary files for later retrieval. Dumping the file contents to the pipeline logs is the easiest way to let people inspect it.
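As a sketch, publishing a file with the Universal Packages task could look like the following; the feed name `my-feed` and package name `sql-scripts` are placeholders, not part of the original setup:

```yaml
# Hypothetical example: publish the generated script as a Universal Package.
# 'my-feed' and 'sql-scripts' are placeholder names.
- task: UniversalPackages@0
  displayName: 'Publish generated SQL script'
  inputs:
    command: 'publish'
    publishDirectory: '$(Build.StagingDirectory)'
    feedsToUsePublish: 'internal'
    vstsFeedPublish: 'my-feed'
    vstsFeedPackagePublish: 'sql-scripts'
    versionOption: 'patch'
```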

You could also create a file with placeholders as a build artefact and merge in the final settings from pipeline variables in each stage; that way you can keep such files as build artefacts. That's what I tend to do for any files of this nature. It sounds like this won't apply to your generated SQL file, though.

Alternatively, if the "seeing for approval" piece is important, you could generate the script, write it to the log, and upload it to Universal Package Management, then ask for approval at the end of the stage. In the next stage, you either download the script from Universal Package Management or regenerate it using the exact same task configuration before executing it.
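The download side in the next stage might look like this sketch (again, feed and package names are placeholders; `'*'` is assumed here to mean the latest published version):

```yaml
# Hypothetical example: retrieve the previously published script in a later stage.
- task: UniversalPackages@0
  displayName: 'Download generated SQL script'
  inputs:
    command: 'download'
    feedsToUse: 'internal'
    vstsFeed: 'my-feed'
    vstsFeedPackage: 'sql-scripts'
    vstsPackageVersion: '*'
    downloadDirectory: '$(System.DefaultWorkingDirectory)'
```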

jessehouwing
  • 106,458
  • 22
  • 256
  • 341
  • 1
    I'm told that future versions of the "Publish Pipeline Artefact" is likely going to enable the scenario we all want. – jessehouwing Jan 28 '19 at 10:50
  • Thanks for the idea. We finally did a workaround by sending the script to an SFTP server, where we can access and review it; then in the second stage we download the script and run it. But the ideal would be to publish an artefact – Mateusz Moska Jan 29 '19 at 10:51
  • 1
    Yeah Universal Package Management would allow a similar thing. Do you check the sha of the script after retrieving it from SFTP? Otherwise, you may end up consuming a different script than generated. Security seems important in your scenario ;). – jessehouwing Jan 29 '19 at 11:22
  • Again, thanks for the idea :) you're right we should do check SHA sum, at the moment we are not doing it. – Mateusz Moska Jan 29 '19 at 11:41
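The integrity check discussed in these comments could be sketched in shell; the file names below are illustrative, not from the original pipeline. Record a SHA-256 hash next to the script when it is generated, and verify it after retrieval before running anything:

```shell
#!/bin/sh
# Sketch: verify a downloaded script against the hash recorded at generation time.
# File names are illustrative placeholders.

# At generation time (stage 1), record the hash next to the script:
printf 'CREATE TABLE demo (id INT);\n' > changesScript.sql
sha256sum changesScript.sql > changesScript.sql.sha256

# After retrieval (stage 2), fail the deployment if the hash does not match:
if sha256sum -c changesScript.sql.sha256; then
    echo "script verified"
else
    echo "script was modified in transit" >&2
    exit 1
fi
```

Shipping the hash over a second channel (or alongside the approval record) is what makes the check meaningful; if the hash travels with the script on the same SFTP server, an attacker who can change one can change both.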
0

Azure DevOps now allows you to download files that have been published as artifacts in earlier stages of the pipeline.

In the code I am sharing, I have implemented generation of a SQL script with the database schema changes, followed by applying those changes (after approval). Some remarks:

  • This won't work if the changes would cause data loss.
  • The path to sqlpackage.exe is only correct when Visual Studio 2019 is installed, as in the windows-2019 images.
  • I passed the following variables through a variable group (more information on how to create variable groups here):
    • targetDBConnectionString
    • servername
    • databasename
    • adminlogin
    • adminPassword
  • I have added an approval to the ApplyChanges stage (in the Pipelines menu, choose Environments, then the applyChanges environment, and then Approvals and checks from the three-dots button in the top right corner). This way the changes are not applied to the database before manual approval takes place.

stages:
- stage: 'Build'
  displayName: 'Build the web application'
  jobs: 

  (...)

  - job: BuildDacpac
    pool:
      vmImage: 'windows-2019'
    steps:
    - task: DotNetCoreCLI@2
      displayName: 'Build the database project'
      inputs:
        command: 'custom'
        custom: 'build'
        projects: 'FQDN\For\The\DB\Project\DBProjectName.sqlproj'
    - task: CopyFiles@2
      displayName: 'Copy dacpac file to a staging directory'
      inputs:
        contents: |
          FQDN\For\The\DB\Project\bin\**\*.dacpac
        targetFolder: '$(Build.StagingDirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish build artifact'
      inputs:
        pathToPublish: '$(Build.ArtifactStagingDirectory)'
        artifactName: dropDacpac
      condition: succeededOrFailed()

- stage: VerifyScript
  displayName: 'Script database schema changes'
  dependsOn:
  - Build
  jobs:
  - deployment: VerifyScript
    pool:
      vmImage: 'windows-2019'
    variables:
    - group: 'Timeline CV - Release'
    environment: 'scriptverification'
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: dropDacpac
            patterns: '**/*'

          - task: CmdLine@2
            displayName: 'Generate schema changes script'
            inputs:
              script: |
                "c:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\140\sqlpackage.exe" ^
                /action:script ^
                /diagnostics:true ^
                /sourcefile:$(Pipeline.Workspace)\dropDacpac\path\to\the\dacpacFile\package.dacpac ^
                /targetConnectionString:$(targetDBConnectionString) ^
                /outputpath:$(Build.StagingDirectory)\changesScript.sql

          - task: PublishPipelineArtifact@1
            inputs:
              targetPath: '$(Build.StagingDirectory)'
              artifactName: dropSqlSchemaChangesScript
            condition: succeededOrFailed()

          - task: PowerShell@2
            displayName: 'Show auto-generated SQL script'
            inputs:
              targetType: 'inline'
              script: |
                Write-Host "Auto Generated SQL Update Script:"
                Get-Content $(Build.StagingDirectory)\changesScript.sql | ForEach-Object { Write-Output $_ }

- stage: ApplyChanges
  displayName: 'Apply database schema changes'
  dependsOn: VerifyScript
  jobs:
  - deployment: ApplyChanges
    pool:
      vmImage: 'windows-2019'
    variables:
    - group: 'Timeline CV - Release'
    environment: 'applyChanges'
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: dropSqlSchemaChangesScript
          - task: SqlDacpacDeploymentOnMachineGroup@0
            displayName: 'Deploy SQL schema changes script'
            inputs:
              taskType: 'sqlQuery'
              sqlFile: '$(Pipeline.Workspace)\dropSqlSchemaChangesScript\changesScript.sql'
              targetMethod: 'server'
              authScheme: 'sqlServerAuthentication'
              serverName: '$(servername)'
              databaseName: '$(databasename)'
              sqlUsername: '$(adminlogin)'
              sqlPassword: '$(adminPassword)'
ccoutinho
  • 3,308
  • 5
  • 39
  • 47
  • will this work with a **release pipeline**? In my release pipeline the `$(Build.ArtifactStagingDirectory)` is not set. – Saurabh Nanda Jul 02 '21 at 05:30
  • Have you tried the Download Pipeline Artifact task, in the release pipelines (classic view)? I believe it should be possible. Check this: https://learn.microsoft.com/en-us/azure/devops/pipelines/artifacts/pipeline-artifacts?view=azure-devops&tabs=classic – ccoutinho Jul 02 '21 at 13:58
  • I get the following error: "Cannot upload to a pipeline artifact from release environment." – Saurabh Nanda Jul 05 '21 at 04:36
  • It seems that your problem is with uploading/publishing the artifact, not with downloading it. If it makes sense in your use case, I recommend creating a build pipeline that finishes by publishing the artifact. The release pipeline you're working on would then be triggered whenever an artifact is published by the build pipeline – ccoutinho Jul 06 '21 at 07:45