
I have a project (PROJECT_A) that is triggered through a webhook and expects the variable $PRODUCT to be set. Its value is used to select a certain path in the build. The job in the .gitlab-ci.yml file looks like this:

deploy:
  stage: publish
  script:
    - ./generate_doc.sh $PRODUCT

A webhook call looks like this:

http://<GITLAB_URL>/api/v4/projects/710/ref/master/trigger/pipeline?token=<TOKEN>&variables[PRODUCT]=<PRODUCT>

I call this trigger through a webhook from other projects, including PROJECT_B. So I manually filled in the desired value in the respective webhooks, e.g. for PROJECT_B:

http://<GITLAB_URL>/api/v4/projects/710/ref/master/trigger/pipeline?token=<TOKEN>&variables[PRODUCT]=PROJECT_B

When the pipeline in PROJECT_A is triggered, $PRODUCT has the value PROJECT_B, as expected.

I would like to parameterize the pipeline further and take, among other things, the commit message into account. All the information I need is apparently provided in the webhook payload.

Is there a built-in way to read this payload in a pipeline? Or, alternatively, to put the contents of the payload into a variable in the webhook call, like this:

http://<GITLAB_URL>/api/v4/projects/710/ref/master/trigger/pipeline?token=<TOKEN>&variables[COMMIT_REF]=???
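One workaround I can think of, though it doesn't really answer the payload question, is a sketch like the following. It assumes the trigger is fired from a CI job in PROJECT_B rather than a plain webhook, that a hypothetical $TRIGGER_TOKEN variable holds the trigger token, and that the predefined $CI_COMMIT_MESSAGE variable is available in my GitLab version:

notify_project_a:
  stage: deploy
  script:
    # Forward the current commit's SHA and message as trigger variables
    - >
      curl --request POST
      --form "token=$TRIGGER_TOKEN"
      --form "ref=master"
      --form "variables[PRODUCT]=PROJECT_B"
      --form "variables[COMMIT_REF]=$CI_COMMIT_SHA"
      --form "variables[COMMIT_MESSAGE]=$CI_COMMIT_MESSAGE"
      "http://<GITLAB_URL>/api/v4/projects/710/trigger/pipeline"

That only helps when the caller is itself a pipeline, though, not a plain webhook, which is why I would prefer to read the payload directly.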

I have found discussions about doing parameterized Jenkins builds using the webhook payload, including this related question. There is also a similar question in the Gitlab forum, without any answer.

Is there a way to access that payload in a Gitlab CI pipeline? I could probably extract the provided values with a jq call, but how can I get the JSON in the first place?

Carsten
  • Any success figuring this out? – orodbhen Jun 21 '19 at 18:18
  • No, I could not parameterize further. The variables are now fixed in the Webhook call, e.g. `http://.../ref/master/trigger/pipeline?token=<...>&variables[PRODUCT]=PROJECT_A`. The triggered project has a script that handles the variable values. – Carsten Jun 26 '19 at 07:44

2 Answers


If you run compgen -v to list the environment variables, a pipeline triggered from the UI (i.e. without a JSON payload) shows 3 fewer variables in its job log than one triggered by POSTing a JSON payload.

The additional variables are:

  • CI_BUILD_TRIGGERED
  • CI_PIPELINE_TRIGGERED
  • TRIGGER_PAYLOAD

If you print their values out and re-run the pipeline:

echo CI_BUILD_TRIGGERED=$CI_BUILD_TRIGGERED
echo CI_PIPELINE_TRIGGERED=$CI_PIPELINE_TRIGGERED
echo TRIGGER_PAYLOAD=$TRIGGER_PAYLOAD

You get (for username YOUR_USER_NAME and repo name YOUR_REPO_NAME):

CI_BUILD_TRIGGERED=true
CI_PIPELINE_TRIGGERED=true
TRIGGER_PAYLOAD=/builds/YOUR_USER_NAME/YOUR_REPO_NAME.tmp/TRIGGER_PAYLOAD

So, as you can see, TRIGGER_PAYLOAD holds the path to a file in a temporary directory suffixed .tmp. Re-running the pipeline and printing that file out with cat shows that it contains the payload; in my case that's JSON.
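For completeness, here's a minimal debug job that does all of the above in one run (just a sketch; the job name and stage are arbitrary):

inspect_trigger:
  stage: publish
  script:
    # List all environment variables visible to the job
    - compgen -v
    # Print the trigger-related variables
    - echo CI_BUILD_TRIGGERED=$CI_BUILD_TRIGGERED
    - echo CI_PIPELINE_TRIGGERED=$CI_PIPELINE_TRIGGERED
    - echo TRIGGER_PAYLOAD=$TRIGGER_PAYLOAD
    # Dump the payload file itself (JSON in my case)
    - cat "$TRIGGER_PAYLOAD"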

Louis Maddox

Unless you have developed something very unusual, it's practically guaranteed that the data provided in the webhook's payload is in JSON format. Gitlab automatically treats the received payload as a FILE-type variable: the data is always stored in the file referenced by $TRIGGER_PAYLOAD, and its contents can be read and manipulated with cat and jq.

Gitlab's default job image is Alpine-based and already includes jq as part of the setup. Because job scripts are shell commands, getting the data out is pretty straightforward. If you're using a different image for your jobs, such as Ubuntu, you can easily install jq in the job's before_script section. For example, when a new user signs up for a service on my website, a webhook sends that event data to my gitlab repo, which then creates a project folder.

setup_new_user:
  before_script:
    # Install jq, if necessary (skip this on images that already ship it)
    - apt-get update && apt-get install -y jq
  script:
    # Read the webhook payload from the file referenced by $TRIGGER_PAYLOAD
    - WEBHOOK_BODY=$(cat "$TRIGGER_PAYLOAD")
    # Pull individual fields out of the JSON with jq
    - NEW_MEMBER_EMAIL=$(echo "$WEBHOOK_BODY" | jq -r '.members[0].email')
    # ... further steps using the extracted values
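To tie this back to the commit message mentioned in the question: the exact field names depend on which event sent the webhook, but a standard GitLab push-event payload typically carries a top-level ref plus a commits array whose entries include a message. A sketch along those lines (the field names are assumptions about your payload, so adjust them to what cat $TRIGGER_PAYLOAD actually shows):

    # Branch ref and first commit message from a push-event payload (field names assumed)
    - COMMIT_REF=$(jq -r '.ref' "$TRIGGER_PAYLOAD")
    - COMMIT_MESSAGE=$(jq -r '.commits[0].message' "$TRIGGER_PAYLOAD")
    - echo "Triggered by $COMMIT_REF, message: $COMMIT_MESSAGE"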
user658182