
I have a website written in TypeScript with a button that triggers an Azure pipeline run. I would like to pass something from the website to the pipeline as a parameter, and I saw here that you can pass a .yaml structure as an object to a pipeline.

Is it possible to pass a .yaml that was converted from an .xlsx file to the pipeline, and how would one go about that? For clarification: the website has a file upload, and I need the content of the .xlsx file the user uploads in one of the steps of the pipeline. There is no backend, just the website.

If that's not possible, how should I do it?

hardyta

1 Answer


Since you can pass a .yaml structure as an object to a pipeline, you can try the workaround below.

Define a runtime parameter in your pipeline to hold the contents of the .xlsx file. See below:

parameters:
- name: contentKey
  displayName: Pool Image
  default: contentDefaultValue

Then you can call the pipeline Runs REST API from your website and provide templateParameters in the request body to override the runtime parameter defined in your pipeline with the contents of the .xlsx file. See below:

{
  "templateParameters":{
     "contentKey": "contentValue"
  }
}
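
Since there is no backend, that REST call would come straight from your TypeScript code in the browser. Below is a minimal sketch of such a call using fetch; the organization, project, pipeline id and personal access token are placeholders, and note that a PAT embedded in client-side code is visible to every visitor, so treat this purely as an illustration:

// Minimal sketch: queue a pipeline run and override the contentKey
// runtime parameter with the converted .xlsx content.
async function triggerPipeline(xlsxContent: string): Promise<void> {
  const organization = "my-org";           // placeholder
  const project = "my-project";            // placeholder
  const pipelineId = 42;                   // placeholder
  const pat = "<personal-access-token>";   // placeholder

  const url = `https://dev.azure.com/${organization}/${project}` +
    `/_apis/pipelines/${pipelineId}/runs?api-version=6.0-preview.1`;

  const response = await fetch(url, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Basic auth with an empty user name and the PAT as the password
      "Authorization": "Basic " + btoa(":" + pat),
    },
    body: JSON.stringify({
      templateParameters: { contentKey: xlsxContent },
    }),
  });

  if (!response.ok) {
    throw new Error(`Failed to queue the pipeline run: ${response.status}`);
  }
}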

If you have to pass the yaml file itself to the pipeline, you can try to upload the yaml file to Azure DevOps and then download it in your pipeline, so that the pipeline steps can access it.

Below are possible methods you can use to upload the yaml file to Azure DevOps.

1. You can create a repository in your Azure DevOps project to hold the yaml file and upload the file to the repository via the REST API from your website. See an example here. See the REST API here.

Then you can run a git clone command in a script task to download the file in your pipeline. A sketch of the upload call follows below.
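
For this option, here is a hedged sketch of the upload from the website, assuming the Git Refs and Git Pushes REST APIs; the organization, project, repository name, branch and file path are placeholders:

// Minimal sketch: push the generated yaml file into an Azure Repos
// repository. Use changeType "edit" instead of "add" if the file
// already exists on the branch.
async function uploadYamlToRepo(yamlContent: string, pat: string): Promise<void> {
  const repoUrl = "https://dev.azure.com/my-org/my-project" +
    "/_apis/git/repositories/my-repo";     // placeholders
  const headers = {
    "Content-Type": "application/json",
    "Authorization": "Basic " + btoa(":" + pat),
  };

  // A push needs the current tip of the branch as oldObjectId.
  const refs = await fetch(`${repoUrl}/refs?filter=heads/main&api-version=6.0`, { headers })
    .then(r => r.json());
  const oldObjectId: string = refs.value[0].objectId;

  await fetch(`${repoUrl}/pushes?api-version=6.0`, {
    method: "POST",
    headers,
    body: JSON.stringify({
      refUpdates: [{ name: "refs/heads/main", oldObjectId }],
      commits: [{
        comment: "Upload converted .xlsx content",
        changes: [{
          changeType: "add",
          item: { path: "/uploads/content.yaml" },
          newContent: { content: yamlContent, contentType: "rawtext" },
        }],
      }],
    }),
  });
}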

2. You can upload the file as a work item attachment. See the REST API here.

Then pass the attachment id to the pipeline when you run it (you can refer to the workaround above and define a runtime parameter to hold the id value).

Then call the REST API in a script task in your pipeline to get the yaml file.
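
A hedged sketch of the upload on the website side, with organization, project and file name as placeholders; the attachment id that comes back is what you would pass as the runtime parameter:

// Minimal sketch: upload the yaml content as a work item attachment
// and return the attachment id for the pipeline run.
async function uploadAttachment(yamlContent: string, pat: string): Promise<string> {
  const url = "https://dev.azure.com/my-org/my-project" +
    "/_apis/wit/attachments?fileName=content.yaml&api-version=6.0"; // placeholders

  const response = await fetch(url, {
    method: "POST",
    headers: {
      "Content-Type": "application/octet-stream", // raw file body
      "Authorization": "Basic " + btoa(":" + pat),
    },
    body: yamlContent,
  });

  const attachment: { id: string; url: string } = await response.json();
  return attachment.id;
}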

3. Upload the yaml file as an Azure DevOps secure file. See this thread.

Then use the Download Secure File task to download the yaml file in your pipeline.

Hope above helps!

Update:

In the yaml pipeline file, you can define your parameter as below:

parameters:
  - name: paramname
    type: object
    displayName: 'configure path'
    default: 
      param1: '[{\"a\":\"x\",\"b\":\"y\"},{\"a\":\"x\",\"b\":\"y\"}]'
      param2: 'string1'
      param3: 'string2'

In the REST API call, you can pass the request body as below:

{
  "templateParameters":{
        "paramname": "{\"param1\":\"'[{\\'a\\':\\'x\\',\\'b\\':\\'y\\'},{\\'a\\':\\'x\\',\\'b\\':\\'y\\'}]'\",\"param2\":\"string11\", \"param3\":\"string22\"}"
      }
}
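
If the website builds that body with JSON.stringify instead of escaping the quotes by hand, the double serialization looks roughly like this sketch (the property names mirror the example parameter above):

// Sketch: serialize the nested object once so it arrives as the string
// value the object parameter expects, then wrap it in the request body.
const paramname = {
  param1: JSON.stringify([{ a: "x", b: "y" }, { a: "x", b: "y" }]),
  param2: "string1",
  param3: "string2",
};

const requestBody = JSON.stringify({
  templateParameters: {
    paramname: JSON.stringify(paramname),
  },
});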

Then you can access the parameter in a Bash task like below:

 echo "${{parameters.paramname.param1}}"
 echo "${{parameters.paramname.param2}}"
Levi Lu-MSFT
  • Thanks a lot for the elaborate answer! The first approach looks like what I need. But I have a question about that method: What does `displayName: Pool Image` do? Is Pool Image a placeholder that I need to replace with something? – hardyta Sep 18 '20 at 14:14
  • It is the display name of the parameter you see on the UI page when you queue the pipeline. You can replace it or not specify a display name at all. – Levi Lu-MSFT Sep 21 '20 at 01:48
  • Thanks for the info on `displayName`! I will accept the answer as soon as I tried it! I had to work on some other stuff on the website and will get to this part today or tomorrow. – hardyta Sep 21 '20 at 07:46
  • Could you kindly help me figure out how to properly access the values of the object? I seem to be doing it wrong. My object has a structure like this: `[param1:[{a:'x',b:'y',c:'z'},{a:'x1',b:'y1',c:'z1'}...], param2:'string', param3:'string']`. When I try to run the pipeline where I put this task in a job: `- bash: echo ${{parameters.paramname.param1}}` or `- bash: echo ${{parameters.paramname[0][0]}}`, I don't see anything in the output. It seems to be able to read the structure, because in the second example if I leave one `[0]` out it says 'can't convert object to string'. – hardyta Sep 22 '20 at 14:26
  • Ah, this has to be confusing. Since I had to make some changes on the website, there were two more parameters I figured I needed to send to the pipeline, which I'm trying to achieve by combining those variables into one big JSON object containing the original xlsx data and two strings. – hardyta Sep 22 '20 at 15:17
  • You can check the example in above update to access the values of the object. – Levi Lu-MSFT Sep 23 '20 at 04:24
  • Thanks for the example. Works as it should. It didn't work in my example because I provided the JSON in a wrong format, as I manually typed the parameters in (the pipeline trigger doesn't work right now, but that's a different problem). When I take the raw JSON that the website is sending to Azure and use that in a manual test as parameters, it works. If the answer wasn't already accepted, I'd do it right now. Thanks again :) – hardyta Sep 23 '20 at 11:15
  • Maybe you know how to answer this one as well? https://stackoverflow.com/questions/68064726/how-to-get-an-external-file-path-in-an-azure-pipeline-task – CodeMonkey Jun 23 '21 at 06:58