102

I have two jobs in jenkins, both of which need the same parameter.

How can I run the first job with a parameter so that when it triggers the second job, the same parameter is used?

Rob Kielty
Stefan Kendall
  • There are several ways; one good way is to use the current job's parameters, or to use predefined parameters in the downstream trigger. – ksr Jul 29 '18 at 22:21
  • This title is so confusing. How is this "passing variables between jobs?". Also accepted answer is a plugin. Fancy that! –  Mar 25 '20 at 02:52

12 Answers

73

You can use the Parameterized Trigger Plugin, which will let you pass parameters from one task to another.

You also need to add the parameter you passed from upstream to the downstream job.

Wei Wei
Łukasz Rżanek
    Hi sorry for sounding like a noob, but is it okay if someone can edit this with details on how to do it with the Parameterized Trigger Plugin? – Fadi Oct 20 '15 at 19:04
  • The plugin page has a decent explanation on how to proceed and after the install question marks on the right hand side of each option have a pretty good explanation on how to use the plugin as well. What else do you require? – Łukasz Rżanek Nov 04 '15 at 15:30
  • Side note: It doesn't look like exported environment variables created in bash script sections are eligible for substitution in the output parameters (for example 'export VERSION' won't make 'UPSTREAM_VERSION=$VERSION' take the correct value; it just gets '$VERSION' instead). – Mark McKenna Mar 22 '16 at 15:04
  • This answer is insufficient – tarabyte Mar 31 '16 at 17:08
  • I agree that there should be some sort of example of how to pass the parameters to the target job. The current Parameterized Trigger Plugin page does not give good information about this, e.g. what kind of syntax you should use when passing the parameters. – skrii Apr 20 '16 at 14:50
  • The plugin doesn't seem to work anymore. See the [long list of open issues](https://issues.jenkins-ci.org/browse/JENKINS-36917?jql=project%20%3D%20JENKINS%20AND%20status%20in%20(Open%2C%20%22In%20Progress%22%2C%20Reopened)%20AND%20component%20%3D%20%27parameterized-trigger-plugin%27). I cannot pass any parameter values with this plugin anymore. Any other solution? – Markus L Jul 28 '16 at 11:13
  • Since that answer was given, Jenkins went through a lot of dramatic changes. If you use an old data flow, this plugin will work. Otherwise, you will need to use the DSL of their Pipeline implementation. I will try to update the answer later to reflect that... – Łukasz Rżanek Mar 14 '17 at 18:02
43

1. Post-Build Actions > select "Trigger parameterized build on other projects"

2. Enter the environment variable with its value. The value can also be a Jenkins build parameter.

Detailed steps can be seen here:

https://itisatechiesworld.wordpress.com/jenkins-related-articles/jenkins-configuration/jenkins-passing-a-parameter-from-one-job-to-another/

Hope it's helpful :)
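
The two steps above can also be expressed in Pipeline syntax (a sketch only; the job name my-downstream-job and the parameter MY_PARAM are placeholders):

```groovy
// Upstream Jenkinsfile step: trigger the downstream job and hand it
// the same parameter this build received.
build job: 'my-downstream-job',
      parameters: [string(name: 'MY_PARAM', value: params.MY_PARAM)]
```

The downstream job still has to declare MY_PARAM as a build parameter for the value to be visible there.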

Vinu Joseph
28

The accepted answer here does not work for my use case. I needed to be able to dynamically create parameters in one job and pass them into another. As Mark McKenna mentions, there is seemingly no way to export a variable from a shell build step to the post-build actions.

I achieved a workaround using the Parameterized Trigger Plugin by writing the values to a file and using that file as the parameters to import via 'Add post-build action' -> 'Trigger parameterized build...' then selecting 'Add Parameters' -> 'Parameters from properties file'.
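
A minimal sketch of that Execute-Shell build step (the file name build.properties and the key names are illustrative):

```shell
#!/bin/sh
# Compute a value dynamically in the build step...
BUILD_NUMBER="${BUILD_NUMBER:-42}"   # provided by Jenkins; defaulted here for illustration
VERSION="1.2.${BUILD_NUMBER}"

# ...and write it as KEY=VALUE lines to a properties file in the workspace.
# 'Trigger parameterized build...' -> 'Add Parameters' ->
# 'Parameters from properties file' then reads this file and passes each
# pair to the downstream job.
printf 'UPSTREAM_VERSION=%s\n' "$VERSION" > build.properties
```

The downstream job then sees UPSTREAM_VERSION as an ordinary build parameter.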

Nigel Kirby
  • This is what i needed. Thanks. – sdot257 Jun 07 '16 at 15:03
  • If you're willing to use the jenkins 2.x pipeline, you can use writeFile/stash->unstash/readFile to copy state data between jobs. http://www.slideshare.net/ericlongtx/jenkins-days-workshop-pipelines-eric-long Checkout slide 21 for an example. – siesta Dec 07 '16 at 04:21
  • This is required if you want SHELL variables to pass through. Much appreciated for this answer. – Carl Wainwright Sep 05 '19 at 22:16
22

I think the answer above needs some updating:

I was trying to create a dynamic directory to store my upstream build artifacts, so I wanted to pass my upstream job's build number to the downstream job. I tried the above steps but couldn't make it work. Here is how it worked:

  1. I copied the artifacts from my current job using copy artifacts plugin.
  2. In post build action of upstream job I added the variable like "SOURCE_BUILD_NUMBER=${BUILD_NUMBER}" and configured it to trigger the downstream job.
  3. Everything worked except that my downstream job was not able to get $SOURCE_BUILD_NUMBER to create the directory.
  4. So I found out that to use this variable, I have to define the same variable in the downstream job as a parameter variable:

[screenshot: SOURCE_BUILD_NUMBER defined as a build parameter in the downstream job]

This is because newer versions of Jenkins require you to define the variable in the downstream job as well. I hope this helps.

Tarun
  • Totally agree. This is a mandatory update which 100% completes the initial answer. – CodeSlave Feb 05 '20 at 09:21
  • I also tried the two more upvoted options but neither of those worked until adding the extra configuration outlined in step 4 above. I did not need to have copy artifacts enabled for it to work. – Jeff Fol Nov 10 '20 at 19:38
10

(for fellow googlers)

If you are building a serious pipeline with the Build Flow Plugin, you can pass parameters between jobs with the DSL like this:

Supposing there is a string parameter "CVS_TAG" available, in order to pass it to other jobs:

build("pipeline_begin", CVS_TAG: params['CVS_TAG'])
parallel (
   // will be scheduled in parallel.
   { build("pipeline_static_analysis", CVS_TAG: params['CVS_TAG']) },
   { build("pipeline_nonreg", CVS_TAG: params['CVS_TAG']) }
)
// will be triggered after previous jobs complete
build("pipeline_end", CVS_TAG: params['CVS_TAG'])

Hint for displaying available variables/params:

// output values
out.println '------------------------------------'
out.println 'Triggered Parameters Map:'
out.println params
out.println '------------------------------------'
out.println 'Build Object Properties:'
build.properties.each { out.println "$it.key -> $it.value" }
out.println '------------------------------------'
Offirmo
  • Build Flow Plugin is deprecated, users should migrate to https://wiki.jenkins-ci.org/display/JENKINS/Pipeline+Plugin – vhamon Oct 03 '19 at 14:51
9

Just add my answer in addition to Nigel Kirby's as I can't comment yet:

In order to pass a dynamically created parameter, you can also export the variable in the 'Execute Shell' tile and then pass it through 'Trigger parameterized build on other projects' => 'Predefined parameters' => give 'YOUR_VAR=$YOUR_VAR'. My team uses this feature to pass the npm package version from the build job to the deployment jobs.

UPDATE: the above only works for Jenkins-injected parameters; parameters created from a shell still need to use the properties-file method, e.g. echo YOUR_VAR=${YOUR_VAR} > variable.properties, and pass that file downstream.

Shawn
  • Thanks. This is the only answer that works for me. Job 1 has a shell script that runs "echo HELLO=world > /tmp/name-of-job-one.txt", then use "Trigger parameterized build on other projects" to trigger job-two with "Parameters from properties file" /tmp/name-of-job-one.txt. If job-two expects the parameter HELLO, and prints the parameter's value, its value will be "world". – alberto56 Jun 12 '23 at 15:27
4

I faced the same issue when I had to pass a pom version to a downstream Rundeck job.

What I did was use parameter injection via a properties file, as such:

1) Creating properties in properties file via shell :

Build actions:

  • Execute a shell script
  • Inject environment variables

E.g.: [screenshot: properties definition]

2) Passing defined properties to the downstream job : Post Build Actions :

  • Trigger parameterized build on other project
  • Add parameters : Current build parameters
  • Add parameters : predefined parameters

E.g.: [screenshot: properties sending]

3) It was then possible to use $POM_VERSION as such in the downstream Rundeck job.

/!\ Jenkins Version : 1.636

/!\ For some reason when creating the triggered build, it was necessary to add the option 'Current build parameters' to pass the properties.

Eli Mous
  • EDIT : Found a blooper in what I wrote. In properties definition, it should have been: echo POM_VERSION=$POM_VERSION > play.properties and not : echo $POM_VERSION >> play.properties Sorry about that. – Eli Mous Aug 22 '17 at 19:08
3

Reading through the answers, I don't see another option that I like so will offer it as well. I love the parameterization of jobs, but it doesn't always scale well. If you have jobs which are not directly downstream of the first job but farther down the pipeline, you don't really want to parameterize every job in the pipeline so as to be able to pass the parameters all the way through. Or if you have a large number of parameters used by a variety of other jobs (especially those not necessarily tied to one parent or master job), again parameterization doesn't work.

In these cases, I favor outputting the values to a properties file and then injecting that in whatever job I need using the EnvInject plugin. This can be done dynamically, which is another way to solve the issue from another answer above where parameterized jobs were still used. This solution scales very well in many scenarios.
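
A sketch of the producing side (the file name shared.properties, the keys, and the hard-coded values are all illustrative; real builds would compute them):

```shell
#!/bin/sh
# Execute-shell build step that writes dynamically computed values to a
# properties file. Any job in the pipeline can later pick these up via
# the EnvInject plugin's 'Inject environment variables' option, without
# declaring matching build parameters everywhere.
RELEASE_TAG="v2.0.1"   # stand-in for a value computed at build time

printf 'RELEASE_TAG=%s\n' "$RELEASE_TAG" > shared.properties
printf 'ARTIFACT_DIR=out/%s\n' "$RELEASE_TAG" >> shared.properties
```

The consuming job just points EnvInject at shared.properties and the keys appear as environment variables.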

aleb
tbradt
2

This can be done via a Groovy function:

upstream Jenkinsfile - param CREDENTIALS_ID is passed downstream
pipeline {
    agent any
    stages {
        stage('trigger downstream') {
            steps {
                build job: 'my_downstream_job_name',
                      parameters: [string(name: 'CREDENTIALS_ID', value: 'other_credentials_id')]
            }
        }
    }
}
downstream Jenkinsfile - if param CREDENTIALS_ID is not passed from upstream, the function returns a default value
def getCredentialsId() {
    if(params.CREDENTIALS_ID) {
        return params.CREDENTIALS_ID;
    } else {
        return "default_credentials_id";
    }
}
pipeline {
    environment{
        TEST_PASSWORD = credentials("${getCredentialsId()}")
    }
}
klapshin
1

You can use Hudson Groovy builder to do this.

First Job in pipeline

[screenshot: first job configuration]

Second job in pipeline

[screenshot: second job configuration]

CAMOBAP
0

I figured it out!

With almost two hours' worth of trial and error, I figured it out.

This WORKS and is what you do to pass variables to remote job:

    def handle = triggerRemoteJob(remoteJenkinsName: 'remoteJenkins', job: 'RemoteJob', parameters: "param1=${env.PARAM1}\nparam2=${env.param2}")

Use \n to separate two parameters, with no spaces.

As opposed to parameters: '''someparams''',

we use parameters: "someparams".

the " ... " is what gets us the values of the desired variables. (These are double quotes, not two single quotes)

the ''' ... ''' or ' ... ' will not get us those values. (Three single quotes or just single quotes)
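
This is standard Groovy string behavior, which is where the rule comes from (a sketch; PARAM1 is a placeholder variable, not one of the pipeline's):

```groovy
def PARAM1 = 'foo'
println "param1=${PARAM1}"   // double-quoted GString interpolates: param1=foo
println 'param1=${PARAM1}'   // single quotes are literal:          param1=${PARAM1}
```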

All parameters here are defined in environment{} block at the start of the pipeline and are modified in stages>steps>scripts wherever necessary.

I also tested and found that when you use " ... " you cannot use something like ''' ... "..." ''' or "... '..'..." or any combination of it...

The catch here is that when you are using "..." in the parameters section, you cannot pass a literal single-quoted string; for example, this WILL NOT WORK:

    def handle = triggerRemoteJob(remoteJenkinsName: 'remoteJenkins', job: 'RemoteJob', parameters: "param1=${env.PARAM1}\nparam2='param2'")

If you want to pass something like the one above, you will need to set an environment variable param2='param2' and then use ${env.param2} in the parameters section of the remote trigger plugin step.

Dharman
0

You can also make a job write into a properties file somewhere and have another job read it. One way to do that is to inject the variables via the EnvInject plugin.

Shivam Mishra