
I have two Jenkins pipelines, let's say pipeline-A and pipeline-B. I want to invoke pipeline-A in pipeline-B. How can I do this?

(pipeline-A is a subset of pipeline-B. Pipeline-A is responsible for doing some routine stuff which can be reused in pipeline-B)

I have installed Jenkins 2.41 on my machine.


6 Answers


The following solution works for me:

pipeline {
    agent {
        node {
            label 'master'
            customWorkspace "${env.JobPath}"
        }
    }

    stages {
        stage('Start') {
            steps {
                sh 'ls'
            }
        }

        stage('Invoke_pipeline') {
            steps {
                build job: 'pipeline1', parameters: [
                    string(name: 'param1', value: 'value1')
                ]
            }
        }

        stage('End') {
            steps {
                sh 'ls'
            }
        }
    }
}

Here is a link to the official documentation of the "Pipeline: Build Step": https://jenkins.io/doc/pipeline/steps/pipeline-build-step/

  • How is this working for you? Is "pipeline1" actually a "pipeline{}"? I get this error: "Waiting for non-job items is not supported". I think this is just for jobs, not full pipelines – red888 Sep 16 '17 at 15:32
  • pipeline1 is actually another pipeline name that you need to invoke here. – Yash Jan 02 '18 at 05:45
  • @red888 If you get the error `ERROR: Waiting for non-job items is not supported`, you may need to specify the branch: `build job: 'pipeline1/master', parameters: [` – Céline Aussourd May 23 '18 at 16:42
  • Is there a way to know the name of parent pipeline when child pipeline is running? – Yash Jul 19 '18 at 11:02
  • You can pass the JOB_NAME environment variable as a parameter to the child job. – Matias Snellingen Jan 16 '19 at 16:20
  • Do I always need a defined Jenkins job (as a Pipeline job) for calling with "build job"? Or can I call the Groovy code of a complete pipeline directly from the parent pipeline? An example: I have a parametrized pipeline for building something. With different small other pipelines I want to start the parametrized pipeline with different parameter combinations. Do I need a (parametrized) Jenkins job for the build pipeline, or can I call the big pipeline directly? – afischer Aug 26 '20 at 11:31
  • I have tried this with a Pipeline job only, but it should work for other standard jobs as well. – Yash Aug 31 '20 at 05:26
  • This question has gained a lot of traction in recent years. Has anyone succeeded in getting the job path or job ID of the child job from the parent job, or vice versa? – Yash May 27 '21 at 10:09
  • I'm trying to start a new job inside a Jenkinsfile using `build(job: "app/other/repo/")`, but it is failing due to "Waiting for non-job items is not supported". How can I start the new job from another repo from the first Jenkinsfile? – elulcao Jun 05 '21 at 13:46
  • Watch out that the new build job might run on a different Git commit than the parent job if the branch has changed in the meantime. – clonejo Apr 13 '23 at 11:41
  • I see you're 'stringifying' the parameters. So you are not able to pass a map as a parameter? – mike01010 May 17 '23 at 22:51
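Picking up the comment thread above: a minimal sketch of passing the parent's identity down to the child job, assuming the child pipeline declares string parameters named `PARENT_JOB_NAME` and `PARENT_BUILD_NUMBER` (both parameter names are hypothetical):

```groovy
// In the parent Jenkinsfile: JOB_NAME and BUILD_NUMBER are
// standard environment variables Jenkins sets on every build.
build job: 'pipeline1', parameters: [
    string(name: 'PARENT_JOB_NAME', value: env.JOB_NAME),
    string(name: 'PARENT_BUILD_NUMBER', value: env.BUILD_NUMBER)
]
```

The child can then read `params.PARENT_JOB_NAME` like any other build parameter.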

It's a little unclear whether you want to invoke another pipeline script or another pipeline job, so I'll answer both:

Pipeline script

The "load" step will execute the other pipeline script. If you have both scripts in the same directory, you can load it like this:

def pipelineA = load "pipeline_A.groovy"
pipelineA.someMethod()

The other script (pipeline_A.groovy):

def someMethod() {
    //do something
}

return this
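Note that `load` needs a workspace, so it has to run inside a `node` block (or under an `agent` in declarative pipeline), and the `.groovy` file must already be present there. A minimal sketch, assuming the script is checked in alongside the Jenkinsfile:

```groovy
node {
    // fetch the repo so pipeline_A.groovy exists in the workspace
    checkout scm
    def pipelineA = load "pipeline_A.groovy"
    pipelineA.someMethod()
}
```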

Pipeline job

If you are talking about executing another pipeline job, the build job step in your Jenkinsfile can accomplish this:

build job: '<Project name>', propagate: true, wait: true

propagate: propagate errors, i.e. if the downstream build fails, fail this build as well

wait: wait for the downstream build to complete

If you have parameters for the job, you can add them like this:

build job: '<Project name>', parameters: [[$class: 'StringParameterValue', name: 'param1', value: 'test_param']]
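As asked in the comments, if the parent needs the child's result to decide how to proceed, one approach is a sketch like the following: set `propagate: false` so the parent does not fail immediately, and inspect the object the `build` step returns:

```groovy
// propagate: false keeps the parent running even if the child fails
def childBuild = build job: '<Project name>', propagate: false, wait: true
echo "Child finished with result: ${childBuild.result}"
if (childBuild.result == 'SUCCESS') {
    // proceed with the happy path
} else {
    // react to the failure, e.g. mark this build unstable
    currentBuild.result = 'UNSTABLE'
}
```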
  • Just want to know how I can get the status of the child pipeline in the parent pipeline. I want to proceed with the parent pipeline based on the result of the child pipeline. – Yash Aug 30 '17 at 05:29
  • Is there a way to pass "current build parameters" from one job to another, as in the "old" Jenkins? – Shahar Hamuzim Rajuan Feb 01 '18 at 13:39
  • I used the first example from 'pipeline job', with propagate and wait, but Jenkins gives me this error: ERROR: Waiting for non-job items is not supported. Any clue what I did wrong? – P Kuijpers May 02 '18 at 14:14
  • I suspect you are trying to start a job that does not exist, or you are using the wrong name. If you, for example, want to call a multibranch job, use: `build job: "my-job/my-branch-name", propagate: true, wait: true`. – Matias Snellingen May 03 '18 at 15:51
  • Is there a way to dynamically use the environmental BRANCH_NAME variable from the multibranch job? Something like `build job: "my-job/$BRANCH_NAME", propagate: true, wait: true`? I keep getting error with the syntax I'm using. I've tried env.BRANCH_NAME as well. The errors are "no items named my-job/$BRANCH_NAME found (I am using my actual job name in place of "my-job"). – msteppe91 Oct 02 '18 at 16:15
  • @msteppe91 the pipeline for the downstream branch needs to exist already before you can build. I'm running into the same error, can't find a workaround. The environment variable should be used like this, though: `${BRANCH_NAME}` – colti Oct 11 '18 at 20:33
  • @colti The downstream branch already existed. What I needed to do was specify BRANCH_NAME as `${env.BRANCH_NAME}`. So my final call turned out to be something like: `build job: "Downstream_Job/${env.BRANCH_NAME}", parameters: [string(name: 'some_param', value: 'true')]` – msteppe91 Oct 15 '18 at 11:52
  • How could one make the stages in the sub-pipeline show up in the Blue Ocean UI? – handras Jan 15 '19 at 14:03
  • If I call the other script, it uses a different workspace. Is there a way to prevent this, so that both scripts use the same one? – benez Feb 26 '19 at 22:30
  • You can use `ws('c:\\workspace\\path\\here') { code here }` to set the location of the workspace your job runs in. – Matias Snellingen Feb 28 '19 at 08:54
  • `build job: "my-job/my-branch-name", propagate: true, wait: true` — is this syntax valid for another repo? For example, can I use `my-job/path/another/repo`? – elulcao Jun 05 '21 at 13:28
  • Hi, what if my multibranch pipeline is in a folder? Should it be `build job: 'name-folder/pipeline1/master'`? – Dhody Rahmad Hidayat Feb 10 '22 at 08:36

As mentioned by @Matias Snellingen and @Céline Aussourd, in the case of launching a multibranch job you have to specify the branch to build, like this:

stage ('Invoke_pipeline') {
    steps {
        build job: 'pipeline1/master', parameters: [
        string(name: 'param1', value: "value1")
        ]
    }
}

In my case it solved the problem.
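As discussed in the comments on the previous answer, the branch segment can also be taken from the current build's environment, so a multibranch parent can trigger the matching branch of the child job. A sketch, assuming the child multibranch job `pipeline1` has a branch of the same name:

```groovy
stage('Invoke_pipeline') {
    steps {
        // BRANCH_NAME is set automatically in multibranch pipelines
        build job: "pipeline1/${env.BRANCH_NAME}", parameters: [
            string(name: 'param1', value: 'value1')
        ]
    }
}
```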


I am going to post my solution, which is similar to those of @Michaël COLL, @Matias Snellingen, and @Céline Aussourd. For a multibranch pipeline I am using the following code in the Jenkinsfile to trigger multibranch pipeline B from multibranch pipeline A (the example covers both a regular pipeline and a multibranch pipeline):

post {
    always {
        echo 'We are in the post part and the Jenkins build with QA tests is going to be triggered.'
        // For triggering a Pipeline
        //build job: 'WGF-QA WITH ALLURE', parameters: [string(name: 'QA-Automation', value: 'value from Build pipeline')]
        // For triggering a Multibranch Pipeline
        build job: 'Testing QA/QA Selenium Tests/feature%2FGET-585', parameters: [string(name: 'QA-Automation', value: 'value from Build pipeline')]
    }
}

Just be sure to define the whole path to the branch as it is defined in the job, and instead of / in the branch name use %2F (feature/GET-585 -> feature%2FGET-585).
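If the job name is built dynamically, the branch segment can be encoded programmatically rather than by hand — `URLEncoder` turns `/` into `%2F` (a sketch; note it also encodes other special characters the branch name may contain):

```groovy
// encode the branch name so it is a valid job-path segment
def branch = 'feature/GET-585'
def encoded = java.net.URLEncoder.encode(branch, 'UTF-8')
// encoded == 'feature%2FGET-585'
build job: "Testing QA/QA Selenium Tests/${encoded}", parameters: [
    string(name: 'QA-Automation', value: 'value from Build pipeline')
]
```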


To add to what @Matias Snellingen said: if you have multiple functions, the `return this` should go after the function that will be called from the main pipeline script. For example, in:

def someMethod() {
    helperMethod1()
    helperMethod2()
}

return this

def helperMethod1() {
    //do stuff
}

def helperMethod2() {
    //do stuff
}

`someMethod()` is the one that will be called in the main pipeline script.


Another option is to create a package in a shared library, load it, and execute it from the package.

package name.of.package
import groovy.json.*

def myFunc(var1) {
    // do something with var1 and return the result
    def result = var1
    return result
}

Then consume it:

@Library('name_of_repo')
import name.of.package.*

// the class name must match the Groovy file name in the shared library
utils = new name_of_pipeline()
// here you can invoke
utils.myFunc(var)

Hope it helps.
