51

I would like to share a variable across two steps.

I define it like:

- export MY_VAR="FOO-$BITBUCKET_BUILD_NUMBER"

but then when I try to print it in another step:

- echo $MY_VAR

it's empty.

How can I share such a variable?

pixel

4 Answers

43

As Mr-IDE and Rik Tytgat explained, you can share environment variables with a following step by writing them to a file and passing that file along as an artifact. One way to do so is to write your variables to a shell script in one step, declare it as an artifact, and then source it in the next step.

definitions:
  steps:
    - step: &build
        name: Build
        script:
          - MY_VAR="FOO-$BITBUCKET_BUILD_NUMBER"
          - echo $MY_VAR
          - echo "export MY_VAR=$MY_VAR" >> set_env.sh
        artifacts: # define the artifacts to be passed to each future step
          - set_env.sh
    - step: &deploy
        name: Deploy
        script:
            # use the artifact from the previous step
          - cat set_env.sh 
          - source set_env.sh
          - echo $MY_VAR

pipelines:
  branches:
    master:
      - step: *build
      - step:
          <<: *deploy
          deployment: test
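Note that the echo line above writes the value of MY_VAR unquoted, which breaks if the value ever contains spaces. A small variation, if that can happen in your case, is to quote the value when writing set_env.sh so it still sources cleanly:

# Quote the value so set_env.sh sources cleanly even if MY_VAR contains spaces
# (embedded double quotes would still need extra escaping)
- echo "export MY_VAR=\"$MY_VAR\"" >> set_env.sh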

NB: In my case, the step that publishes set_env.sh as an artifact is not always part of my pipelines. If the same applies to you, be sure to check that the file exists in the next step before sourcing it:

- step: &deploy
    name: Deploy
    image: alpine
    script:
      # only source the env file if the previous step actually produced it
      - |
        if [ -e set_env.sh ]; then
          cat set_env.sh
          source set_env.sh
        fi
belgacea
  • In case anyone else is wondering about the << syntax in the example... "The <<: operator in YAML is usable to import the contents of one mapping into another, similarly to the ** double-splat operator in Python or ... object destructuring operator in JavaScript" - https://stackoverflow.com/a/41065222 – Opentuned Dec 16 '20 at 16:45
  • this will expose the environment variables in a downloadable file, right? that could be a security issue – Victor Ferreira Dec 14 '21 at 02:54
  • @VictorFerreira Yes, but as long as your repo isn't public, it should be ok-ish. There is no way to [delete `artifacts`](https://jira.atlassian.com/browse/BCLOUD-21188) right now and I can't think of a way to safely cypher it without making it a mess. Anyway, use GitHub for your own sake... – belgacea Dec 15 '21 at 09:30
  • It is unfortunate that there's no `environment:` or `variables:` option so you can define options for a shared step, and then run the same step, with different values in parallel. The additional amount of code nearly negates just copying the step several times. Another option is to only use BB pipelines as a runner of shell scripts so you pass the variables as arguments to the script. A lot simpler in the yaml file, but has its own issues that may or may not make the issue better/worse. – Richard A Quadling Feb 03 '22 at 17:57
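A minimal sketch of the "pass the variables as arguments to a script" idea from the last comment, assuming a hypothetical deploy.sh committed to the repository:

pipelines:
  default:
    - step:
        name: Deploy
        script:
          # deploy.sh is a hypothetical script in the repo; the values are passed
          # as positional arguments instead of exported environment variables
          - bash deploy.sh "FOO-$BITBUCKET_BUILD_NUMBER" "test"

Inside deploy.sh the values arrive as $1 and $2, so the YAML stays small while the shell script owns the logic.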
38

For some reason, exported environment variables are not retained between the child items of a "step:" or between the top-level "step:" items (these items are described in the Bitbucket Pipelines documentation). But you can copy all the environment variables to a file and read them back again, because files are preserved within a single step, and can be passed to later steps as artifacts:

1. Share variables between the child items of a "step:"

How to share variables between "script:" and "after-script:"

pipelines:
  default:
    - step:
        script:
          # Export some variables
          - export MY_VAR1="FOO1-$BITBUCKET_BUILD_NUMBER"
          - export MY_VAR2="FOO2-$BITBUCKET_BUILD_NUMBER"
          - echo $MY_VAR1
          - echo $MY_VAR2

          # Copy all the environment variables to a file, as KEY=VALUE, to share to other steps
          - printenv > ENVIRONMENT_VARIABLES.txt

        after-script:
          # If the file exists, read all the previous environment variables
          # from the file, and export them again
          - |
            if [ -f ENVIRONMENT_VARIABLES.txt ]; then
                export $(cat ENVIRONMENT_VARIABLES.txt | xargs)
            fi
          - echo $MY_VAR1
          - echo $MY_VAR2

Note: Try to avoid keys or values that contain spaces or newline characters. The export command will have trouble reading them and can throw errors. One possible workaround is to use sed to automatically delete any line that contains a space character:

# Copy all the environment variables to a file, as KEY=VALUE, to share to other steps
- printenv > ENVIRONMENT_VARIABLES.txt
# Remove lines that contain spaces, to avoid errors on re-import (then delete the temporary file)
- sed -i -e '/ /d' ENVIRONMENT_VARIABLES.txt ; find . -name "ENVIRONMENT_VARIABLES.txt-e" -type f -print0 | xargs -0 rm -f


2. Share variables between the top-level "step:" items

pipelines:
  default:
    - step:
        script:
          - export MY_VAR1="FOO1-$BITBUCKET_BUILD_NUMBER"
    - step:
        script:
          - echo $MY_VAR1 # This will not work

In this scenario, Bitbucket Pipelines will treat the 2 "step:" items as completely independent builds, so the second "step:" will start from scratch with a blank folder and a new git clone.

So you should share files between steps by using declared artifacts, as shown in the answer by belgacea (19 Dec 2019).
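A minimal sketch of that combination, reusing this answer's ENVIRONMENT_VARIABLES.txt file and declaring it as an artifact (the note about spaces in values above still applies):

pipelines:
  default:
    - step:
        script:
          - export MY_VAR1="FOO1-$BITBUCKET_BUILD_NUMBER"
          # Save the variables as KEY=VALUE lines for later steps
          - printenv > ENVIRONMENT_VARIABLES.txt
        artifacts:
          - ENVIRONMENT_VARIABLES.txt # preserved for the following steps
    - step:
        script:
          # Re-import the variables saved by the previous step, if the file exists
          - if [ -f ENVIRONMENT_VARIABLES.txt ]; then export $(cat ENVIRONMENT_VARIABLES.txt | xargs); fi
          - echo $MY_VAR1 # now prints FOO1-<build number>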

Mr-IDE
  • This will not work, because the repository is not updated and the next step will start with the `git clone` of the branch/commit, and your `ENVIRONMENT_VARIABLES.txt` will be gone. This **is** possible using artifacts, though. Add `ENVIRONMENT_VARIABLES.txt` as an artifact in your `bitbucket-pipelines.yml` file and it will be available in the next step (if the next step is run within 7 days, artifacts are removed after a week). – Rik Tytgat May 31 '19 at 13:51
  • @RikTytgat - This process refers to **one** single run of the build machine. It's not possible to retain exported variables from previous builds of the CI pipeline, and use them on future builds. – Mr-IDE Dec 19 '19 at 13:59
  • I am aware of that, but the question was about retaining a variable between 2 steps in the same run, not between 2 runs. – Rik Tytgat Dec 20 '19 at 15:42
  • I've been missing variables between pipeline steps. This was the answer I was looking for. – jsonUK Jul 30 '21 at 14:34
  • @Mr-IDE does `find . -name "ENVIRONMENT_VARIABLES.txt-e"` have a typo `-e` in the file name? – vladimirror Oct 21 '21 at 11:58
  • @vladimirror No, `"ENVIRONMENT_VARIABLES.txt-e"` is correct. However, the entire `find` command may not be needed. I only added it because `sed -i -e` on MacOS causes a new temporary file to be created (with "-e" appended to the filename). But `sed` on Linux/Ubuntu does not have this bug. Bitbucket Pipelines normally runs on Linux/Ubuntu machines. – Mr-IDE Oct 21 '21 at 12:24
22

I'm afraid it seems impossible to share an environment variable from one step to another, but you can define global environment variables for all steps in the project settings, under the Pipelines category:

Settings -> Pipelines -> Repository Variables
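Once a variable is defined there, it is available in every step without any extra wiring. For example, assuming a repository variable named MY_SHARED_VAR has been added on that settings screen:

pipelines:
  default:
    - step:
        name: Build
        script:
          - echo $MY_SHARED_VAR # injected from the repository variables
    - step:
        name: Deploy
        script:
          - echo $MY_SHARED_VAR # same value here, no artifact needed

Keep in mind these are static values maintained in the UI, so they cannot carry something computed during the build, such as FOO-$BITBUCKET_BUILD_NUMBER.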
Jack Klimov
6

I know this question is rather old, but I've found a cleaner approach without uploading and downloading artifacts across steps.

Instead of defining an anchored step, you can anchor a script containing the export commands in the definitions section and reuse it explicitly as part of each step. Note that a script defined via such an anchor is a single YAML scalar (effectively one command line), so multiple commands need to be chained with &&.

definitions:
  commonItems:
    &setEnv export MY_VAR="FOO-$BITBUCKET_BUILD_NUMBER" && 
      export MY_VAR_2="Hey" &&
      export MY_VAR_3="What you're building"

Here's how you would call it in your steps.

pipelines:
  default:
    - step:
        name: First step
        script:
          - *setEnv
          - echo $MY_VAR # FOO-1
          - echo $MY_VAR_2 # Hey
          - echo $MY_VAR_3 # What you're building

    - step:
        name: Second step
        script:
          - *setEnv
          - echo $MY_VAR # FOO-1
          - echo $MY_VAR_2 # Hey
          - echo $MY_VAR_3 # What you're building
Akmal