
I currently have a monorepo with services in subdirectories that I'm leaning towards turning into a multirepo with a metarepo.

One of the reasons I decided to give Azure DevOps a try is that someone told me you can have triggers on subdirectories, like:

trigger:
  branches:
    include:
    - master
  paths:
    include:
    - client

Tested and it works.

However, what I'm wondering is whether it's possible to have multiple independent triggers, or whether this requires either a polyrepo or multiple .yml files. The reason being: if there are only changes in the client service, only that service's tests, build, and deployment should run, without triggering the api service's tests, build, and deploy.

For example:

trigger:
  branches:
    include:
    - master
  paths:
    include:
    - client

  stages:
    ...
    Run tests
    If tests pass, build and push to ACR
    Deploy to AKS
    ...

trigger:
  branches:
    include:
    - master
  paths:
    include:
    - api

  stages:
    ...
    Run tests
    If tests pass, build and push to ACR
    Deploy to AKS
    ...

That way, changes in one service don't cause the entire application to be rebuilt, just what changed.

However, does this require multiple .yml files (I'm not even sure anything other than azure-pipelines.yml is recognized), does this necessitate a polyrepo, or is this doable in a single azure-pipelines.yml in a way I'm just not seeing?

cjones
    Check this out https://learn.microsoft.com/en-us/azure/devops/pipelines/test/test-impact-analysis?view=azure-devops&viewFallbackFrom=vsts Yes, you can have multiple .yml files with any name, really. When you create the build pipeline, you pick and reference the existing YAML. There's no easy way to trigger tests based on changes in a certain git directory. What you are referencing is build task conditionals, but there are no built-in variables you can utilize as a conditional. There might be some complex API operations you could put together by hitting the Git API. – Anthony Klotz Jan 03 '20 at 22:31

3 Answers


If I understand your request correctly, you can achieve this in a single azure-pipelines.yml. Please check the example YAML below.

trigger:
  branches:
    include:
    - master
  paths:
    include:
    - client/*
    - api/*

jobs:
- job: getchangepath
  pool:
    vmImage: 'windows-latest'
  steps: 
  - powershell: |
      # Call the Git "Commits - Get Changes" REST API for the commit that triggered the build
      $url = "$(System.CollectionUri)/$(System.TeamProject)/_apis/git/repositories/$(Build.Repository.ID)/commits/$(Build.SourceVersion)/changes?api-version=5.1"
      $result = Invoke-RestMethod -Uri $url -Headers @{Authorization = "Bearer $(System.AccessToken)"} -Method GET

      # Keep only changed folders (gitObjectType "tree") and extract their paths
      $changesFolder = $result.changes | Where-Object { $_.item.gitObjectType -match "tree" } | ForEach-Object { $_.item.path }

      # Set an output variable when a changed folder matches /client or /api
      foreach($path in $changesFolder){
        if($path -match '/client'){
          echo "##vso[task.setvariable variable=Client;isOutput=true]$True"
          break
        }
      }

      foreach($path in $changesFolder){
        if($path -match '/api'){
          echo "##vso[task.setvariable variable=Api;isOutput=true]$True"
          break
        }
      }
    name: MyVariable

- job: client
  pool:
    vmImage: 'windows-latest'
  dependsOn: getchangepath
  condition: eq(dependencies.getchangepath.outputs['MyVariable.Client'], 'true')
  steps:
  - powershell: echo 'client job start'
  
- job: api
  pool:
    vmImage: 'windows-latest'
  dependsOn: getchangepath
  condition: eq(dependencies.getchangepath.outputs['MyVariable.Api'], 'true')
  steps:
  - powershell: echo 'api job start'

In the above YAML I have three jobs. In the first job, getchangepath, I call the Git Commits - Get Changes REST API from a PowerShell task to get the paths changed by the commit that triggered the build, and set output variables if any changed path contains /client or /api.

Jobs client and api depend on job getchangepath and run conditionally on the output variables set by getchangepath.

Suppose I change a file in the client folder and commit the change to the Azure repo. After job getchangepath finishes, MyVariable.Client will be set to true. Job client will then evaluate its condition and start, while job api's condition will evaluate to false and that job will be skipped.
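As an aside (not part of the original answer): since the agent already has the commit checked out, a lighter-weight variant could derive the changed paths from git itself instead of the REST API. A minimal sketch, assuming a single-commit push (a multi-commit push would need to diff against the previous build's commit instead); the downstream job conditions stay the same:

- job: getchangepath
  pool:
    vmImage: 'windows-latest'
  steps:
  - powershell: |
      # List the files changed by the commit that triggered the build
      $changed = git diff-tree --no-commit-id --name-only -r $(Build.SourceVersion)
      # -match against an array returns the matching elements; a non-empty result is truthy
      if ($changed -match '^client/') {
        echo "##vso[task.setvariable variable=Client;isOutput=true]True"
      }
      if ($changed -match '^api/') {
        echo "##vso[task.setvariable variable=Api;isOutput=true]True"
      }
    name: MyVariable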

David Gardiner
Levi Lu-MSFT

I recently faced this problem. You don't need to hard-code paths or call the DevOps REST API from PowerShell as in the solution above.

Here is a simpler solution using out-of-the-box YAML and the workingDirectory property, per the official Azure DevOps documentation.

Set up a project structure like this, with each service having its own YAML file:

.
├── README.md
├── azure-pipelines.yml
├── service-a
│   ├── azure-pipelines-a.yml
│   └── …
└── service-b
    ├── azure-pipelines-b.yml
    └── …

You might not need a root pipeline, but if you do, you will want to ignore the sub-projects:

# Excerpt from /azure-pipelines.yml

trigger:
  paths:
    exclude: # Exclude!
      - 'service-a/*'
      - 'service-b/*'

And in the sub-projects, you want them to pay attention to themselves:

# Excerpt from /service-a/azure-pipelines-a.yml

trigger:
  paths:
    include: # Include!
      - 'service-a/*' # or 'service-b/*'

Caveat - Working directories!

Your sub-project pipelines still run with the repository root as their working directory. You can change this using the workingDirectory key, for example (using a variable to avoid repetition):

variables:
  - name: working-dir
    value: 'service-b/'

steps:
- script: npm install
  workingDirectory: $(working-dir)

- script: npm run task
  workingDirectory: $(working-dir)

If your projects share steps, you should instead use Pipeline Templates (in another repository), per the official docs.
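For illustration, a shared template and a pipeline consuming it might look roughly like this (a minimal sketch; the template repository name and file name are made up):

# build-service.yml, stored in a separate templates repository (hypothetical)
parameters:
- name: workingDir
  type: string

steps:
- script: npm install
  workingDirectory: ${{ parameters.workingDir }}
- script: npm run task
  workingDirectory: ${{ parameters.workingDir }}

And a service pipeline referencing it:

# Excerpt from /service-a/azure-pipelines-a.yml
resources:
  repositories:
  - repository: templates
    type: git
    name: MyProject/pipeline-templates  # hypothetical project/repo

steps:
- template: build-service.yml@templates
  parameters:
    workingDir: 'service-a/'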

julie-ng
    But you still need to create multiple pipelines and specify each yml file? Using the PS script, you can manage all of that in one yml pipeline. Am I correct? – Ceros Aug 18 '20 at 15:52

    Yes, you need separate pipelines per directory. Which one is better will depend on the size and complexity of your project. Personally I think such scripts make debugging difficult. I would decouple with multiple pipelines and use pipeline templates for repetitive tasks as needed. – julie-ng Nov 14 '20 at 07:38

    @julie-ng I've tried this approach and it works well, thanks for this great example. However, the pipelines I've set up inside service-a and service-b do not get triggered when changes are made within these folders; only the root pipeline gets triggered. Is there a way to get the service-a and service-b pipelines triggered automatically? Any suggestions? – giri-jeedigunta Feb 27 '21 at 14:16

If you get a 203 response with the above method, you can try this way instead:

$userName = "whatever"
# Build a Basic auth header from username:PAT (personal access token)
$AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $userName, "$(PAT)")))
$result = Invoke-RestMethod -Uri $url -Headers @{Authorization = "Basic $AuthInfo"} -Method GET

I ran this in PowerShell and it works fine.
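For context, a minimal sketch of how this could sit in a pipeline step, assuming the PAT is stored as a secret pipeline variable named myPat (a hypothetical name) and $url is built as in the accepted answer. Secret variables are not exposed to scripts automatically, so the PAT is mapped in via env:

steps:
- powershell: |
    $userName = "whatever"
    # Basic auth header from username:PAT; the PAT arrives via the env mapping below
    $AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $userName, $env:PAT)))
    $result = Invoke-RestMethod -Uri $url -Headers @{Authorization = "Basic $AuthInfo"} -Method GET
  env:
    PAT: $(myPat)  # secret variable defined on the pipeline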

Captain_Levi