
When making changes to YAML-defined Azure DevOps Pipelines, it can be quite tedious to push changes to a branch just to see the build fail with a parsing error (valid YAML, but invalid pipeline definition) and then try to trial-and-error fix the problem.

It would be nice if the feedback loop could be made shorter, by analyzing and validating the pipeline definition locally; basically a linter that knows about the various resources etc. that can be defined in an Azure pipeline. However, I haven't been able to find any tool that does this.

Is there such a tool somewhere?

Tomas Aschan

6 Answers


UPDATE: This functionality was removed in Issue #2479 in Oct 2019


You can run the Azure DevOps agent locally with its YAML testing feature.

  1. Follow the instructions in the microsoft/azure-pipelines-agent project to install an agent on your local machine.
  2. Then use the docs page Run local (internal only) to access the feature available within the agent.

This should get you very close to the type of feedback you would expect.

Jamie
  • Thanks! That's a little overkill for me (since it seems to require installing the agent locally; I didn't manage to make it work with just the stuff in the repository...). However, this incantation, if I could get it to work, seems like it would do exactly what I'm after: `./run.sh --yaml --what-if`. I'll file an issue with them to see if they can start publishing that functionality as a global dotnet tool or something. – Tomas Aschan Oct 30 '18 at 09:14
  • Is this GA yet? The [linked page](https://github.com/Microsoft/azure-pipelines-agent/blob/d504d4ef0e82d34439aa0eb494f218c2d9146a95/docs/preview/yamlgettingstarted-runlocal.md) suggests that it's internal only. – Tom Wright Sep 09 '19 at 16:42
  • The [latest version](https://github.com/microsoft/azure-pipelines-agent/blob/fe5a24a1d7e93724c4849295a8200804cf2ec301/docs/preview/outdated/yamlgettingstarted-localrun.md) has been commented out - the [raw document](https://raw.githubusercontent.com/microsoft/azure-pipelines-agent/fe5a24a1d7e93724c4849295a8200804cf2ec301/docs/preview/outdated/yamlgettingstarted-localrun.md) indicates that it was still in flux at the time it was changed (2018-06-24) – Chris Hunt Sep 28 '19 at 15:40

FYI - this feature has been removed; see Issue #2479 ("remove references to 'local run' feature")

Hopefully they'll bring it back later, considering GitHub Actions has the ability to run actions locally

John Goodwin
  • That's not really true. act is a third-party project with absolutely no support or consideration from GitHub, and it has plenty of gotchas and differences from the official GHA runner environment. GHA has no ability to run actions locally; you merely have the ability to hack something fairly similar to run locally. – mathrick Jul 19 '23 at 16:26

Azure DevOps provides a Run Preview API endpoint that takes a YAML override and returns the expanded YAML. I added support for it to the AzurePipelinePS PowerShell module. The command below executes the pipeline with the id of 01, but with my YAML override, and returns the expanded YAML pipeline.

Preview - Preview. Service: Pipelines. API version: 6.1-preview.1. Queues a dry run of the pipeline and returns an object containing the final YAML.

# AzurePipelinesPS session
$session = 'myAPSessionName'

# Path to my local yaml
$path = ".\extension.yml"    

# The id of an existing pipeline in my project
$id = 01        
        
# The master branch of my repository
$resources = @{              
   repositories = @{
       self = @{
           refName = 'refs/heads/master'
        }
   }
}

Test-APPipelineYaml -Session $session -FullName $path -PipelineId $id -Resources $resources
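If you prefer not to use the PowerShell module, the same preview ("dry run") endpoint can be called directly over REST. A minimal sketch in Python; the organization, project, and pipeline id are placeholder values, and sending the request requires a PAT with build permissions:

```python
import json

# Placeholder values - substitute your own organization, project, and pipeline id.
ORG, PROJECT, PIPELINE_ID = "myorg", "myproject", 1

def build_preview_request(yaml_override, ref_name="refs/heads/master"):
    """Build the URL and JSON body for the pipelines preview (dry run) endpoint."""
    url = ("https://dev.azure.com/%s/%s/_apis/pipelines/%d/preview"
           "?api-version=6.1-preview.1" % (ORG, PROJECT, PIPELINE_ID))
    body = {
        "previewRun": True,             # don't actually queue the run
        "yamlOverride": yaml_override,  # local YAML to expand and validate
        "resources": {"repositories": {"self": {"refName": ref_name}}},
    }
    return url, json.dumps(body).encode()

# Sending it (sketch): POST the body with a Content-Type of application/json
# and Basic auth built from an empty username and your PAT. The response
# contains the fully expanded pipeline in its finalYaml field; a parse error
# in your YAML comes back as an HTTP error with a descriptive message.
```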
Dejulia489
  • How can you use this API with templates? You can only submit one override file, so that must be the final version? – user14492 Apr 20 '22 at 14:43
  • I commit my templates into a feature branch, update my extension repository resource to use the feature branch, and then pass the extension to this endpoint as the override YAML. – Dejulia489 Apr 20 '22 at 15:31

A pipeline is described with YAML, and YAML can be validated if you have a schema with rules on how that YAML file should be composed. This works as short feedback for the case you described, especially for syntax parsing errors. YAML schema validation is available for almost any IDE. So, we need:

  1. A YAML schema - against which we will validate our pipelines
  2. An IDE (VS Code as a popular example) - which will perform validation on the fly
  3. Configuration of the two of the above to work together for the greater good

The schema can be found in many places; for this case, I suggest using https://www.schemastore.org/json/. It has an Azure Pipelines schema (the schema contains some issues, like value types that differ from the Microsoft documentation, but it still covers the case of invalid syntax).

VS Code requires an additional plug-in to perform YAML text validation; there are a bunch of those that can validate against a schema. I suggest trying YAML from Red Hat (I know the plugin's rating is not the best, but it works for validation and is also configurable).

In the settings of that VS Code plugin, you will see a section about validation (as in the screenshot).

YAML Schema validation plugin settings

Now you can add the required schema to the settings, even without downloading it to your machine:

    "yaml.schemas": {
        "https://raw.githubusercontent.com/microsoft/azure-pipelines-vscode/v1.174.2/service-schema.json": "/*"
    }

Simply save the settings and restart VS Code. You will see warnings about issues in your Azure DevOps pipeline YAML files (if there are any). Validation failing on purpose in the screenshot below:

(screenshot: validation errors highlighted in VS Code)

See more details with examples here as well
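The editor-based schema check is the most complete, but if you also want a quick gate outside the editor (say, in a pre-commit hook) without installing extra tooling, a rough pre-check of an already-parsed pipeline can catch the most common shape mistakes. This is only a stdlib sketch, not a replacement for validating against the service-schema.json above, and the checks it performs are a hand-picked subset:

```python
def lint_pipeline(doc):
    """Return a list of problems found in a parsed pipeline definition (dict)."""
    problems = []
    if not isinstance(doc, dict):
        return ["top level must be a mapping"]
    # A pipeline has exactly one of steps, jobs, or stages at the top level.
    roots = {"steps", "jobs", "stages"}
    present = roots & doc.keys()
    if len(present) > 1:
        problems.append(
            "only one of %s allowed, got %s" % (sorted(roots), sorted(present)))
    for key in present:
        if not isinstance(doc[key], list):
            problems.append("'%s' must be a sequence" % key)
    # trigger is a branch list, a filter mapping, or the literal string 'none'.
    if "trigger" in doc and not isinstance(doc["trigger"], (list, dict, str)):
        problems.append("'trigger' must be a branch list, filter mapping, or 'none'")
    return problems
```

Feed it the output of any YAML parser (e.g. PyYAML's `yaml.safe_load`) and fail the hook when the returned list is non-empty.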

Sysanin

I can tell you how we manage this disconnect.

We use only pipeline-as-code, yaml.

We use ZERO YAML templates and strictly enforce one-file-per-pipeline.

We use the Azure Pipelines YAML extension for VS Code to get linter-like behaviour in the editor.

Most of the actual work in our pipelines is done by invoking PowerShell that, via sensible defaulting, can also be invoked from the CLI, meaning we can in essence execute anything relevant locally.

Exceptions are configuration of the agent and actual pipeline-only stuff, such as download-artifact tasks, publish tasks, etc.

Let me give some examples:

Here we have the step that builds our front-end components:

(screenshot: the build step in the pipeline YAML)

Here we have that step running in the CLI:

(screenshot: the same step running in the CLI)

I won't post a screenshot of the actual pipeline run, because it would take too long to sanitize, but it is basically the same, plus some more trace information provided by the run.ps1 call wrapper.

Casper Leon Nielsen
  • How do you manage the dependencies like Node, .NET SDK, Maven, etc.? – Liero Dec 06 '21 at 08:17
  • When we e.g. "build-frontend", that function will assert - meaning it fails if the condition is not met - that node and npm are at the predefined versions. In the pipeline we use an install-node task; in the CLI we just assert it, and the developer will nvm install locally – Casper Leon Nielsen Dec 09 '21 at 12:33
  • The correct way, in 2021, ofc is to wrap it all up in a container, but if you have not made the jump yet, this is the way – Casper Leon Nielsen Dec 09 '21 at 12:33
  • Same here. Our YAML files are minimal, 1 main task + upload artifacts task for the build stage, and download artifacts + 1 main task for each deployment stage. Environment variables determine behaviour (e.g. `ARTIFACTS_PATH`) with sensible defaults for running locally. We have a mono repo, so the PowerShell module is checked out along with the application/library code. Benefits: unit tested, runs locally, ability to inspect (intermediate) artifacts, iterate quickly, limits vendor lock-in. Downsides: need to know PowerShell and maintain it ourselves (incl. documentation). – Michiel van Oosterhout May 03 '22 at 15:23
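The "sensible defaults for running locally" pattern from the last comment can be sketched like this (Python for illustration; the answer itself uses PowerShell, and `ARTIFACTS_PATH` is the variable named in the comment):

```python
import os

def artifacts_path():
    """The pipeline sets ARTIFACTS_PATH; a local run falls back to a default."""
    return os.environ.get("ARTIFACTS_PATH", "./artifacts")

# Both the agent and a developer machine invoke the same entry point;
# only the environment differs, so the build logic stays runnable locally.
```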

Such a tool does not exist at the moment - there are a couple of existing issues about it in their feedback channels.

As a workaround, you can install an Azure DevOps build agent on your own machine, register it in its own build pool, and use it for building and validating YAML correctness. See Jamie's answer in this thread.

Of course this means you would need to constantly switch between the official build agents and your own build pool, which is not good. Also, if someone accidentally pushes a change via your own machine, you can suffer all kinds of problems that can occur on a normal build machine (like UI prompts, or running hostile code on your own machine - hostile code could even be an unintended virus infection via 3rd-party executable execution).

There are two approaches which you can take:

  1. Use Cake (Frosting) to perform the build locally as well as on Azure DevOps.
  2. Use PowerShell to perform the build locally as well as on Azure DevOps.

Comparing 1 and 2: 1 has more mechanics built in, like publishing to Azure DevOps (it also supports other build-system providers, like GitHub Actions, and so on).

(I myself would propose the first alternative.)

As for 1: read, for example, the following links to get a slightly better understanding:

Search GitHub for existing projects using "Cake.Frosting" to get an idea of how those projects work.

As for 2: it's possible to use PowerShell to maximize the functionality done on the build-script side and minimize the functionality done in Azure DevOps.

parameters:
  - name: publish
    type: boolean
    default: true
  - name: noincremental
    type: boolean
    default: false

...

      - task: PowerShell@2
        displayName: invoke build
        inputs:
          targetType: 'inline'
          script: |
            # Mimic build machine
            #$env:USERNAME = 'builder'

            # Backup this script if need to troubleshoot it later on
            $scriptDir = "$(Split-Path -parent $MyInvocation.MyCommand.Definition)"
            $scriptPath = [System.IO.Path]::Combine($scriptDir, $MyInvocation.MyCommand.Name)
            $tempFile = [System.IO.Path]::Combine([System.Environment]::CurrentDirectory, 'lastRun.ps1')
            if($scriptPath -ne $tempFile)
            {
                Copy-Item $scriptPath -Destination $tempFile
            }

            ./build.ps1 'build;pack' -nuget_servers @{
                'servername' = @{
                    'url' = "https://..."
                    'pat' = '$(System.AccessToken)'
                }

                'servername2' = @{
                    'url' = 'https://...'
                    'publish_key' = '$(ServerSecretPublishKey)'
                }
            } `
            -b $(Build.SourceBranchName) `
            -addoperations publish=${{parameters.publish}};noincremental=${{parameters.noincremental}}

Then in build.ps1, handle all parameters as necessary.

param ( 

    # Can add operations using simple command line like this: 
    #   build a -addoperations c=true,d=true,e=false -v
    # => 
    #   a c d
    #
    [string] $addoperations = ''
)

...

foreach ($operationToAdd in $addoperations.Split(";,"))
{
    if($operationToAdd.Length -eq 0)
    {
        continue
    }

    $keyValue = $operationToAdd.Split("=")

    if($keyValue.Length -ne 2)
    {
        "Ignoring command line parameter '$operationToAdd'"
        continue
    }

    if([System.Convert]::ToBoolean($keyValue[1]))
    {
        $operationsToPerform = $operationsToPerform + $keyValue[0];
    }
}

This allows you to run all the same operations locally on your own machine, and minimizes the amount of YAML file content.

Notice that I have also added copying of the last executed .ps1 script to a lastRun.ps1 file.

You can use it after a build if you see some non-reproducible problem but want to run the same command on your own machine to test it.

You can use the ` character to continue PowerShell execution on the next line, or, in case it's already a complex structure (e.g. @{), it can be continued as-is.

But even though the YAML syntax is minimized, it still needs to be tested if you want different build phases and multiple build machines in use. One approach is to have a special kind of argument, -noop, which does not perform any operation but only prints what was intended to be executed. This way you can run your pipeline in no time and check that everything that was planned to be executed actually gets executed.

TarmoPikaro