
How do I pass variables between stages in a declarative pipeline?

In a scripted pipeline, I gather the procedure is to write to a temporary file, then read the file into a variable.
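
Roughly like this, I believe (untested sketch, reusing the do_something command from my example below):

node {
    def myVar
    stage("stage 1") {
        sh "do_something > var.txt"
        myVar = readFile("var.txt").trim()
    }
    stage("stage 2") {
        build job: "job2", parameters: [string(name: "var", value: myVar)]
    }
}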

How do I do this in a declarative pipeline?

E.g. I want to trigger a build of a different job, based on a variable created by a shell action.

stage("stage 1") {
    steps {
        sh "do_something > var.txt"
        // I want to get var.txt into VAR
    }
}
stage("stage 2") {
    steps {
        build job: "job2", parameters[string(name: "var", value: "${VAR})]
    }
}
John
  • For the write and read part, there is stash/unstash, btw (see the sketch after these comments). – sebkraemer Nov 22 '17 at 11:34
  • What about using [environment variables](https://jenkins.io/doc/pipeline/tour/environment/), which act like global variables? – Veverke Dec 23 '18 at 14:32
  • Env-variables and files are great... for storing **strings**. Any working ways to share **objects** (Groovy/Java instances) across stages, though? – ulidtko Mar 08 '23 at 16:05
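
A minimal sketch of the stash/unstash approach mentioned in the first comment, reusing the stage names from the question (the stash name "var-file" is arbitrary; stash/unstash also works when the two stages run on different agents):

stage("stage 1") {
    steps {
        sh "do_something > var.txt"
        stash name: "var-file", includes: "var.txt"
    }
}
stage("stage 2") {
    steps {
        unstash "var-file"
        script {
            def myVar = readFile("var.txt").trim()
            build job: "job2", parameters: [string(name: "var", value: myVar)]
        }
    }
}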

8 Answers


If you want to use a file (since a script is the thing generating the value you need), you could use readFile as seen below (OPTION 1). If not, you can skip the file and use sh with the returnStdout option instead (shown commented out as OPTION 2):

// Define a groovy local variable, myVar.
// A global variable without the def, like myVar = 'initial_value',
// was required for me in older versions of Jenkins. Your mileage
// may vary. Defining the variable here arguably adds a bit of clarity,
// showing that it is intended to be used across multiple stages.
def myVar = 'initial_value'

pipeline {
  agent { label 'docker' }
  stages {
    stage('one') {
      steps {
        echo "1.1. ${myVar}" // prints '1.1. initial_value'
        sh 'echo hotness > myfile.txt'
        script {
          // OPTION 1: set variable by reading from file.
          // FYI, trim removes leading and trailing whitespace from the string
          myVar = readFile('myfile.txt').trim()
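          // OPTION 2 (alternative sketch): skip the file and capture the
          // command output directly with returnStdout:
          // myVar = sh(script: 'echo hotness', returnStdout: true).trim()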
        }
        echo "1.2. ${myVar}" // prints '1.2. hotness'
      }
    }
    stage('two') {
      steps {
        echo "2.1 ${myVar}" // prints '2.1. hotness'
        sh "echo 2.2. sh ${myVar}, Sergio" // prints '2.2. sh hotness, Sergio'
      }
    }
    // this stage is skipped due to the when expression, so nothing is printed
    stage('three') {
      when {
        expression { myVar != 'hotness' }
      }
      steps {
        echo "three: ${myVar}"
      }
    }
  }
}
burnettk
  • Can we use this variable with `when {}` ? – Mostafa Hussein Nov 18 '17 at 04:21
  • You can also just use `def myVar` and use it afterwards with `echo ${myVar}` if you want to have your config at the top of the file ;) – bastianwegge Nov 21 '17 at 16:30
  • Isn't writing to a file very evil, and doesn't it create unwanted disk I/O as well? – Dirkos Nov 24 '17 at 08:32
  • You're right @Dirkos, but there's a better way to achieve what was requested without involving file read/writes. See this answer: https://stackoverflow.com/a/43881731/1053510 – user1053510 Dec 28 '17 at 15:08
  • And if you have many variables? Then you must read them one by one. – sirineBEJI Jun 14 '18 at 10:37
  • If you have many variables and you must read them one by one, add a Groovy file with lines like env.variable = value and then do load file.groovy. Yes, I'm answering my own comment from one month ago :p – sirineBEJI Jul 27 '18 at 15:56
  • I had to declare `def myVar = ''` _before_ the `pipeline {}` directive to get the example to run, else it fails on "unknown property" or some such error in stages 2 and 3. Suggest adding the declaration to the top of the example (can I just do that myself?!). Also, despite the age of this post the example is still valid, best description I found anywhere online. Edit: I was not writing out to a file then reading back in, just setting the var in Stage 1 then reading it in subsequent stages. – psteiner Aug 15 '18 at 00:52
  • Hi folks! This does not work if you define agents in the stages and switch machines between stages... Another idea is stashing, but you need to modify an existing file for that... – eventhorizon Aug 17 '18 at 13:32
  • I'm reposting my comment from another answer, but I would recommend using a `withEnv` wrapper in the 2nd stage, so that you can use the variable in the context of `sh` (see the sketch after these comments). Otherwise, it will print an empty string. At least it does with Jenkins 2.124. – Sergio Dec 05 '18 at 22:16
  • Writing to a file will not help at all when I want to pass data to a parallel stage, I imagine. –  May 10 '19 at 12:19
  • @ConradB it should work fine to use the technique here as long as you set the variable before the parallel stage (in the example above, stage "three" is where you'd do the work you need to do in parallel). – burnettk May 13 '19 at 00:45
  • No, this is not a proper way of declaring variables. First of all, you do not declare variables like this in declarative syntax; the parameters and environment directives are used for that. Another thing: saving an attribute value in a file is a super inefficient workaround. Please follow jenkins.io/doc/book/pipeline/syntax/#parameters as @KatieS mentioned under my post – dtotopus Jul 04 '19 at 09:29
  • I think your answer is a good one, @Zigzauer. It's good to have options. I updated my answer to make it clear that writing and reading to a file is not necessary. Thank you. – burnettk Jul 05 '19 at 04:54
  • This question was specifically about declarative pipelines... isn't usage of "script" against that? – Arnab Aug 15 '19 at 15:56
  • @Arnab that's a reasonable consideration, but the script function is available as a part of the declarative pipeline DSL, and it can be useful when you want a mostly-declarative Jenkinsfile but also want to accomplish something the DSL doesn't support. – burnettk Aug 15 '19 at 19:17
  • If I declare the variable at the top, I keep getting a dialog asking me to input it ("Input required"). – peterc Oct 05 '21 at 23:58
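
A minimal sketch of the withEnv wrapper Sergio mentions, reusing myVar from the answer above (the environment variable name MY_VAR is arbitrary):

stage('two') {
  steps {
    withEnv(["MY_VAR=${myVar}"]) {
      // single-quoted Groovy string, so the shell itself expands MY_VAR
      sh 'echo "2.2. sh ${MY_VAR}, Sergio"'
    }
  }
}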

Simply:

  pipeline {
      agent any

      parameters {
          string(name: 'custom_var', defaultValue: '')
      }

      stages {
          stage("make param global") {
              steps {
                  script {
                      def tmp_param = sh(script: 'most amazing shell command', returnStdout: true).trim()
                      env.custom_var = tmp_param
                  }
              }
          }
          stage("test if param was saved") {
              steps {
                  echo "${env.custom_var}"
              }
          }
      }
  }
dtotopus
  • According to the doc pointed to by @KatieS, the parameters defined in the parameters {} block are accessed as ${params.custom_var} and not ${env.custom_var}. Both work, but they are different variables that can contain different values. Still, your solution with the parameters {} block works fine for me, accessing them via ${params.custom_var}. – Martin May 10 '19 at 08:02
  • I was wrong. parameters{} are used for user-provided parameters and seem to be immutable; trying to set them in the pipeline (besides assigning them a default value in parameters{}) will make the stage fail without any error message. So env.custom_var is the way to go. In this case the parameters{} block can be left out (see the sketch after these comments). – Martin May 10 '19 at 09:17
  • Does this work across multiple Jenkinsfiles? What I am trying to do is pass the latest commit on the repo from build.Jenkinsfile to deploy.Jenkinsfile. – Gurpreet Singh Drish Aug 08 '19 at 13:20
  • Any way to insert something like ${workspace} in parameters? Like string(name: 'custom_var', defaultValue: "${workspace}/a") – yuxh Aug 15 '19 at 06:07
  • Not sure how this will work. I don't think you can directly set a variable inside a "steps" block without using a "script" step. – Arnab Aug 15 '19 at 16:02
  • @Martin The error stack trace is printed at the end of `pipeline.log` (but not in the individual stage for some reason): `[Pipeline] End of Pipeline java.lang.UnsupportedOperationException at java.base/java.util.Collections$UnmodifiableMap.put(Collections.java:1457)` – Leponzo Dec 15 '21 at 06:25
  • Corrected the syntax in the solution. Also used the Hidden Parameter plugin so that it does not show an extra parameter on the Jenkins UI. Works well :) – shikha singh Apr 22 '22 at 10:10
  • This solution worked for me. Please note that in newer versions of Jenkins (I'm using 2.346.2) you'll have to surround any logic (in this case updating the variable) inside a `script { }` block, otherwise you'll get an `Expected a step` error. – fsinisi90 Sep 02 '22 at 17:46
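
A minimal sketch of the params vs. env distinction Martin describes in the comments above, using the custom_var name from this answer (the default value is made up):

pipeline {
    agent any
    parameters {
        string(name: 'custom_var', defaultValue: 'typed-in-by-the-user')
    }
    stages {
        stage('show the difference') {
            steps {
                echo "params.custom_var = ${params.custom_var}" // read-only, fixed when the build starts
                script {
                    env.custom_var = 'set at runtime'           // the env copy can be reassigned
                }
                echo "env.custom_var = ${env.custom_var}"       // prints 'set at runtime'
            }
        }
    }
}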

I had a similar problem, as I wanted one specific pipeline to provide variables and many other pipelines to consume them.

I created a my-set-env-variables pipeline

script
{
    env.my_dev_version = "0.0.4-SNAPSHOT"
    env.my_qa_version  = "0.0.4-SNAPSHOT"
    env.my_pp_version  = "0.0.2"
    env.my_prd_version = "0.0.2"
    echo " My versions  [DEV:${env.my_dev_version}] [QA:${env.my_qa_version}] [PP:${env.my_pp_version}] [PRD:${env.my_prd_version}]"
}

I can reuse these variables in another pipeline, my-set-env-variables-test:

script 
{
    env.dev_version = "NOT DEFINED DEV"
    env.qa_version  = "NOT DEFINED QA"
    env.pp_version  = "NOT DEFINED PP"
    env.prd_version = "NOT DEFINED PRD"
}

stage('inject variables') {

    echo "PRE DEV version = ${env.dev_version}"
    script 
    {
       def variables = build job: 'my-set-env-variables'
       def vars = variables.getBuildVariables()
      //println "found variables" + vars
      env.dev_version = vars.my_dev_version
      env.qa_version  = vars.my_qa_version
      env.pp_version  = vars.my_pp_version
      env.prd_version = vars.my_prd_version
    }
}

stage('next job') {
    echo "NEXT JOB DEV version = ${env.dev_version}"
    echo "NEXT JOB QA version = ${env.qa_version}"
    echo "NEXT JOB PP version = ${env.pp_version}"
    echo "NEXT JOB PRD version = ${env.prd_version}"

}


Emmanuel B.

There is no need for (hidden plugin) parameter definitions or temp-file access. Sharing variables across stages can be accomplished by using global Groovy variables in a Jenkinsfile, like so:

#!/usr/bin/env groovy 
def MYVAR

def outputOf(cmd) { return sh(returnStdout:true,script:cmd).trim(); }

pipeline {
    agent any
    stages {
        stage("stage 1") {
            steps {
                script {
                    MYVAR = outputOf('echo do_something')
                }
                sh "echo MYVAR has been set to: '${MYVAR}'"
            }
        }
        stage("stage 2") {
            steps {
                sh '''echo "...in multiline quotes: ''' + MYVAR + ''' ..."'''
                build job: "job2", parameters: [string(name: "var", value: MYVAR)]
            }
        }
    }
}
Ian Carter

I have enhanced the existing solution by correcting the syntax. I also used the Hidden Parameter plugin so that it does not show up as an extra parameter in the Jenkins UI. Works well :)

properties([parameters([[$class: 'WHideParameterDefinition', defaultValue: 'yoyo', name: 'hidden_var']])])

pipeline {
    agent any

    stages {
        stage("make param global") {
            steps {
                script {
                    env.hidden_var = "Hello"
                }
            }
        }
        stage("test if param was saved") {
            steps {
                echo "About to check result"
                echo "${env.hidden_var}"
            }
        }
    }
}
shikha singh

pipeline {
    agent any
    stages {
        stage('stage 1') {
            steps {
                script {
                    def INSTANCE_ID = sh(returnStdout: true, script: 'aws ec2 run-instances --image-id ami-12345678 --count 1 --instance-type t2.micro --key-name MyKeyPair --security-group-ids sg-12345678 --subnet-id subnet-12345678')
                    env.INSTANCE_ID = INSTANCE_ID
                }
            }
        }
        stage('stage 2') {
            steps {
                script {
                    echo "This is your INSTANCE_ID, $INSTANCE_ID"
                }
            }
        }
    }
}

B. Lakshmi

We have several variable levels here: stage, script, and sh. This is where the majority of answers do not stress the differences enough, so people end up using files or scripts to capture outputs... Here's what works for me.

def buildNameGlobal
pipeline {
    agent any // an agent is required in a declarative pipeline
    stages {
        stage("Initialize") {
            steps {
                script {
                    buildNameGlobal = "123"
                    echo "buildNameGlobal: ${buildNameGlobal}"
                    // will show 123
                }
            }
        }

        stage("Task 1") {
            steps {
                script {
                    echo "buildNameGlobal: ${buildNameGlobal}"
                    // will show 123 as expected but we must set env to use it from sh script
                    env.buildNameGlobalEnv = buildNameGlobal
                    sh label: '', script: '''
                    echo "buildNameGlobalEnv: $buildNameGlobalEnv"'''
                    // and now we also have 123 here
                }
            }
        }
    }
}
Peter

I use a shared library method in the background that handles a global environment variable with a JSON map as its data. This way it is persistent across Jenkins controller restarts, and it can store all kinds of objects:

pipelineStore.groovy

def getAll() {
    return jsonUtils.fromJson(env.PIPELINE_STORE)
}

def put(key, value) {
    def tmpMap = jsonUtils.fromJson(env.PIPELINE_STORE)
    if (!tmpMap) {
        tmpMap = [:]
    }
    def result = tmpMap.put(key, value)
    env.PIPELINE_STORE = jsonUtils.toJson(tmpMap)
    return result
}

def get(key) {
    return jsonUtils.fromJson(env.PIPELINE_STORE)[key]
}

def remove(key) {
    def tmpMap = jsonUtils.fromJson(env.PIPELINE_STORE)
    def result = tmpMap.remove(key)
    env.PIPELINE_STORE = jsonUtils.toJson(tmpMap)
    return result
}

jsonUtils.groovy

import groovy.json.JsonSlurper
import groovy.json.JsonBuilder
import groovy.json.JsonSlurperClassic 

def toJson(object) {
    return new JsonBuilder(object).toPrettyString()
}

def fromJson(text) {
    if (!text) {
        return ''
    }
    try {
        return new JsonSlurper().parseText(text)
    } catch (Exception e) {
        return new JsonSlurperClassic().parseText(text)
    }
}

And in my pipelines I can use it across stages like

stages {
    stage ('1') {
        steps {
            script {
                pipelineStore.put("key_1", [name: "value_1", version: 23])
            }
        }
    }
    
    stage ('2') {
        steps {
            script {
                def value = pipelineStore.get("key_1")
            }
        }
    }
}

post {
    unsuccessful {
        node ('master') {
            script {
                pipelineStore.getAll().each { key, value ->
                    // e.g. cleanup operations
                }
            }
        }
    }
}
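
Assuming pipelineStore.groovy and jsonUtils.groovy live in the vars/ directory of a configured shared library (the library name my-shared-lib below is made up), a pipeline would wire it up roughly like this:

@Library('my-shared-lib') _ // hypothetical library name, configured globally in Jenkins

pipeline {
    agent any
    stages {
        stage('store') {
            steps {
                script {
                    // store a whole object (a map), not just a string
                    pipelineStore.put('build_info', [name: 'app', version: 23])
                }
            }
        }
        stage('read') {
            steps {
                script {
                    echo "version: ${pipelineStore.get('build_info').version}"
                }
            }
        }
    }
}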

For more information on shared libraries in Jenkins, see https://www.jenkins.io/doc/book/pipeline/shared-libraries/