
In an Azure DevOps pipeline template, I am declaring a parameter as an array/sequence

parameters:
  mySubscription: ''
  myArray: []

steps:
- task: AzureCLI@2
  inputs:
    azureSubscription: ${{ parameters.mySubscription }}
    scriptType: pscore
    scriptPath: $(Build.SourcesDirectory)/script.ps1
    arguments: '-MyYAMLArgument ${{ parameters.myArray }}'

The value for the parameter is then passed from the pipeline definition as

steps:
- template: myTemplate.yml
  parameters:
    mySubscription: 'azure-connection'
    myArray:
    - field1: 'a'
      field2: 'b'
    - field1: 'aa'
      field2: 'bb'

My problem is that I can't pass that array as-is in YAML syntax (a kind of ToString()) so that a PowerShell script in my template can consume and process it. When trying to run this pipeline, I get the following error: /myTemplate.yml (Line: X, Col: X): Unable to convert from Array to String. Value: Array. The line/column referenced in the error message corresponds to arguments: '-MyYAMLArgument ${{ parameters.myArray }}' in my template.

I also tried to map the parameter as an environment for my script

- task: AzureCLI@2
  inputs:
    azureSubscription: ${{ parameters.mySubscription }}
    scriptType: pscore
    scriptPath: $(Build.SourcesDirectory)/script.ps1
    arguments: '-MyYAMLArgument $Env:MY_ENV_VAR'
  env:
    MY_ENV_VAR: ${{ parameters.myArray }}

This does not work either: /myTemplate.yml (Line: X, Col: Y): A sequence was not expected. This time the line/column refers to MY_ENV_VAR: ${{ parameters.myArray }}.

Has anyone ever faced a similar requirement to pass complex types (here an array/sequence of objects) defined in the pipeline definition to a PowerShell script? If so, how did you achieve it?

GGirard

6 Answers


You can now convert these types of parameters to String using the convertToJson function in an ADO pipeline:

parameters:
  - name: myParameter
    type: object
    default:
        name1: value1
        name2: value2

...

- task: Bash@3
  inputs:
    targetType: inline
    script: |
      echo "${{ convertToJson(parameters.myParameter) }}"

ref: https://developercommunity.visualstudio.com/t/allow-type-casting-or-expression-function-from-yam/880210

convertToJson: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops#converttojson
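
For reference, with the default value above the echoed output is the JSON rendering of the mapping, roughly:

{
  "name1": "value1",
  "name2": "value2"
}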

Ed Randall
  • This is great! I did run into an issue when passing this as an argument to a PowerShell script: the entire string was not coming through. The workaround was to set it as an environment variable for the script instead. – SignalRichard Apr 22 '21 at 13:01
  • We have to be careful using undocumented functions - Hopefully this will become mainstream, but it could just as easily vanish overnight! – Ed Randall Apr 23 '21 at 14:43
  • One approach might be to replace `Bash@3` with `PythonScript@0` and use the json library to cleanly unpack the parameters into python variables. – Ed Randall Aug 03 '21 at 09:58
  • Thanks! It didn't work for me as-is for PowerShell - it seems that convertToJson adds newlines which the PowerShell script doesn't like. See in my answer how I solved it with the ConvertFrom-Json PowerShell function. BTW convertToJson is already documented here: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/expressions?view=azure-devops – MaMazav Aug 13 '21 at 14:10
  • Edited now that convertToJson is now mainstream+documented -thanks @MaMazav – Ed Randall Aug 16 '21 at 14:34
  • Unfortunately convertToJson does not know about possible ##vso[] commands; I use replace(input, search, replaceWith) (from the same link as in this answer) for that. So when a jobs parameter/variable contains an inline script with a vso command, it can get rendered when you log it and then be parsed as a real vso command... oops. Not part of the question, but worth mentioning. – Michael Jan 26 '23 at 18:50

Based on the convertToJson idea by @ed-randall, combined with the ConvertFrom-Json PowerShell cmdlet, we can use a JSON 'contract' to pass values between YAML and the PS script:

- powershell: |
    $myArray = '${{ convertToJson(parameters.myArray) }}' | ConvertFrom-Json
    ...
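
A slightly fuller sketch of the same idea, assuming the myArray of objects from the question (field1/field2 come from the question, everything else here is illustrative). Note the expression lands inside single quotes, so values containing single quotes would need extra escaping:

- powershell: |
    $myArray = '${{ convertToJson(parameters.myArray) }}' | ConvertFrom-Json
    foreach ($item in $myArray) {
      # Each element becomes a PSCustomObject with the original properties
      Write-Host "field1=$($item.field1) field2=$($item.field2)"
    }
  displayName: Consume array parameter as JSON
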
MaMazav

I'm facing a similar problem. My workaround is to flatten the array into a string using different separators for different dimensions.

For example I want to make some parameters required and fail the build if these parameters are not passed. Instead of adding a task for every parameter to check, I want to do this in a single task.

To do this, I first pass an array to another template (check-required-params.yml, which holds the task responsible for checking the parameters). Each element of the array is a string of the form name:value, built with the format expression by concatenating the name and the value of a required parameter, separated by a colon:

# templates/pipeline-template.yml
parameters:
- name: endpoint
  type: string
  default: ''
- name: rootDirectory
  type: string
  default: $(Pipeline.Workspace)
- name: remoteDirectory
  type: string
  default: '/'
- name: archiveName
  type: string
  default: ''
    
#other stuff

      - template: check-required-params.yml
        parameters:
          requiredParams:
          - ${{ format('endpoint:{0}', parameters.endpoint) }}
          - ${{ format('archiveName:{0}', parameters.archiveName) }}

Then in check-required-params.yml, I join the array, separating the elements with a semicolon, using the expression ${{ join(';', parameters.requiredParams) }}. This creates a string of the form endpoint:value;archiveName:value and passes it as an environment variable.

At this point, with a little string manipulation, the script can split the string on the semicolon to get an array of name:value strings, each of which can then be split again, this time using the colon as the separator. My check-required-params.yml looks like:

# templates/check-required-params.yml
parameters:
- name: requiredParams
  type: object
  default: []    
        
steps:
- task: PowerShell@2
  env:
    REQUIRED_PARAMS: ${{ join(';', parameters.requiredParams) }}
  displayName: Check for required parameters
  inputs:
    targetType: inline
    pwsh: true
    script: |
      $params = $env:REQUIRED_PARAMS -split ";"
      foreach($param in $params) {
        if ([string]::IsNullOrEmpty($param.Split(":")[1])) {
          Write-Host "##vso[task.logissue type=error;]Missing template parameter $($param.Split(":")[0])"
          Write-Host "##vso[task.complete result=Failed;]"
        }
      }


Then in my azure-pipelines.yml I can do:

#other stuff
- template: templates/pipeline-template.yml
  parameters:
    endpoint: 'myEndpoint'
    rootDirectory: $(Pipeline.Workspace)/mycode

In this example, the build will fail because I don't pass the parameter archiveName.

You can add some flexibility by using variables to define the separators instead of hardcoding them in the scripts and in the expressions.
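
One way to sketch that flexibility is to make the separator an extra template parameter instead of a hard-coded literal (a parameter rather than a variable, and the names here are illustrative):

# templates/check-required-params.yml (separator made configurable)
parameters:
- name: requiredParams
  type: object
  default: []
- name: separator
  type: string
  default: ';'

steps:
- task: PowerShell@2
  displayName: Check for required parameters
  env:
    REQUIRED_PARAMS: ${{ join(parameters.separator, parameters.requiredParams) }}
    PARAM_SEPARATOR: ${{ parameters.separator }}
  inputs:
    targetType: inline
    pwsh: true
    script: |
      # Split on the same separator that the template used to join
      $params = $env:REQUIRED_PARAMS -split $env:PARAM_SEPARATOR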

leoniDEV
  • This is great, but this will fail if no parameters are specified because `$env:REQUIRED_PARAMS -split ";"` will return an array with an empty element. You can fix this with a quick check `if ([string]::IsNullOrEmpty($env:REQUIRED_PARAMS) -eq $false)` – bryanbcook May 26 '20 at 15:18

Script file arguments

The example below provides the syntax needed to pass an Azure DevOps YAML boolean and an array to a PowerShell script file via arguments.

boolean -> Switch
object -> Array

Powershell Script

[CmdletBinding()]
param (
    [Parameter()]
    [switch]
    $Check,

    [Parameter()]
    [string[]]
    $Array
)

If($Check.IsPresent)
{
    Write-Host "Check is present"
}
else {
    Write-Host "Check is not present"
}

Write-Host "Next we loop the array..."
Foreach($a in $Array){
    Write-Host "Item in the array: $a"
}

Yaml Pipeline

trigger: none

pool:
  vmImage: windows-latest

parameters:
  - name: checkBool
    type: boolean

  - name: paramArray
    type: object
    default:
      - one
      - two

steps:
- task: PowerShell@2
  inputs:
    filePath: 'Scripts/DebugSwitches.ps1'
    arguments: -Check:$${{ parameters.checkBool }} -Array ${{ join(', ', parameters.paramArray) }}

Boolean Syntax

Notice the YAML boolean is passed to the PowerShell switch parameter with a colon ':' and no spaces.

Array Syntax

Notice the YAML object array above uses the join expression to format the array as a comma-separated list, which is passed to the PowerShell array argument.
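
With the defaults above (and checkBool set to true), the rendered arguments line would look roughly like the following, which PowerShell parses into the $Check switch and a two-element $Array:

-Check:$True -Array one, two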

Dejulia489

How to pass complex DevOps pipeline template parameter to script

I am afraid we cannot pass complex DevOps pipeline template parameters directly to a PowerShell script.

Currently, Azure DevOps tasks only support passing one-dimensional arrays; they cannot accept a two-dimensional array (an array of objects). Although we can define a parameter holding such an array, we need to expand it in the template with an each expression like:

- ${{ each field in parameters.myArray }}:

We could use it like:

- ${{ each step in parameters.buildSteps }}:
  #- ${{ each pair in step }}:

    - task: PowerShell@2
      inputs:
        targetType : inline
        script: |
          Write-Host 'Hello World'

But we cannot pass the two-dimensional array directly to the task, like [field1: 'a', field2: 'b']. That is the reason why you got the error Unable to convert from Array to String.

You could check document Extend from a template for some more details.
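
For the array of objects in the question, a sketch of such an expansion could look like the following (one task per element, with the fields passed as plain strings; this is illustrative rather than taken from the documentation):

steps:
- ${{ each item in parameters.myArray }}:
  - task: PowerShell@2
    inputs:
      targetType: inline
      script: |
        # field1/field2 are expanded per element at template-compile time
        Write-Host "field1=${{ item.field1 }} field2=${{ item.field2 }}"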

Hope this helps.

Leo Liu

As @Leo Liu MSFT mentioned in his answer, this is indeed not supported right now, but someone already opened an issue for this improvement.

That issue also contains a good workaround for now: use environment variables instead. The drawback of this solution is that you need to be aware of the data structure in order to map it properly.

parameters:
  mylist: []
  #where mylist is a sequence of object matching the mapping:
  #- name: 'the name 1'
  #  value: 'the value of 1'
  #  index: 0
  #- name: 'the name 2'
  #  value: 'the value of 2'
  #  index: 1

env:
  ${{ each item in parameters.mylist }}:
    ${{ format('SCRIPT_PARAM_{0}_KEY', item.index) }}: ${{ item.name }}
    ${{ format('SCRIPT_PARAM_{0}_VAL', item.index) }}: ${{ item.value }}
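
On the PowerShell side, the script can then rebuild the list from those environment variables; a minimal sketch (the SCRIPT_PARAM_* names follow the mapping above, the reconstruction logic is illustrative):

$i = 0
$items = @()
while (Test-Path "Env:SCRIPT_PARAM_${i}_KEY") {
    # Rebuild one object per index from its KEY/VAL pair
    $items += [pscustomobject]@{
        Name  = (Get-Item "Env:SCRIPT_PARAM_${i}_KEY").Value
        Value = (Get-Item "Env:SCRIPT_PARAM_${i}_VAL").Value
    }
    $i++
}
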
GGirard