I have several projects with very similar pipelines, e.g.:

pipeline {
    agent any
    environment {
        JAVA_HOME = '...'
    }
    options {
        timeout(time: 1, unit: 'HOURS')
    }
    parameters {
        extendedChoice(
            name: 'modules',
            description: 'Modules to build',
            type: 'PT_MULTI_SELECT',
            value: 'app,common,data,ui',
            defaultValue: 'app'
        )
    }
    stages {
        stage('Build') {
            steps {
                script {
                    def modules = params.modules.split(',')

                    withGradle {
                        if ('app' in modules) {
                            sh './gradlew app:assemble'
                        }

                        // other modules...
                    }
                }

                archiveArtifacts(artifacts: '**/build/outputs/**/*.jar', allowEmptyArchive: true)
            }
        }
    }
}

I'd like to avoid copying this boilerplate to new projects, so creating a shared library sounds like a good approach.

However, every project has different modules. I'd like these to be configurable at the project level, e.g.:

// my-shared-library/vars/bootstrap.groovy
def call(body) {
    pipeline {
        agent any
        environment {
            JAVA_HOME = '...'
        }
        options {
            timeout(time: 1, unit: 'HOURS')
        }
        stages {
            stage('Build') {
                steps {
                    script {
                        // get parameters from project
                        body()

                        def modules = params.modules.split(',')

                        withGradle {
                            for (module in modules) {
                                sh "./gradlew $module:assemble"
                            }
                        }
                    }

                    archiveArtifacts(artifacts: '**/build/outputs/**/*.jar', allowEmptyArchive: true)
                }
            }
        }
    }
}

Ideally I'd be able to use the shared library above like this in one of my projects:

library 'my-shared-library@master'

bootstrap {
    parameters {
        extendedChoice(
            name: 'modules',
            description: 'Modules to build',
            type: 'PT_MULTI_SELECT',
            value: 'app,common,data,ui',
            defaultValue: 'app'
        )
    }
}
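
For reference, the closure-handling pattern in the Pipeline shared libraries documentation doesn't call body() inside the pipeline; it copies the closure's assignments into a config map up front via the closure's delegate. A rough sketch of that pattern adapted to my case (here the module list is passed as plain configuration rather than as a build parameter, so it sidesteps the parameters question rather than answering it):

// my-shared-library/vars/bootstrap.groovy (sketch using the documented delegate pattern)
def call(body) {
    // collect project-level configuration from the closure before declaring the pipeline
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()

    pipeline {
        agent any
        stages {
            stage('Build') {
                steps {
                    script {
                        // config.modules is expected to be a comma-separated string, e.g. 'app,common'
                        def modules = config.modules.split(',')

                        withGradle {
                            for (module in modules) {
                                sh "./gradlew ${module}:assemble"
                            }
                        }
                    }
                }
            }
        }
    }
}

// project Jenkinsfile
library 'my-shared-library@master'

bootstrap {
    modules = 'app,common,data,ui'
}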

When I run this pipeline, the modules parameter is null. I also tried defining parameters outside of the bootstrap block like so:

library 'my-shared-library@master'

properties([
    parameters([
        extendedChoice(
            name: 'Modules',
            description: 'Modules to build',
            type: 'PT_MULTI_SELECT',
            value: 'app,common,data,ui',
            defaultValue: 'app'
        )
    ])
])

bootstrap()

This pipeline throws the following error:

org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 13: unexpected token: extendedChoice @ line 13, column 13.
               extendedChoice(
               ^

Is something like this possible? I'm struggling to find documentation for this use case, particularly regarding declarative pipelines, though I found a few examples that are pretty close.

  • You can absolutely pass Pipeline parameters as method arguments to your shared libraries and their associated global vars. The questions you link to also explain how to do it. Could you please elaborate on where the confusion is? Have you looked at the Pipeline shared libraries documentation? – Matthew Schuchard Jun 25 '20 at 17:02
  • @MattSchuchard the first two questions I linked to are using scripted pipeline syntax instead of declarative, which seems to make a difference. The last question requires the job to be suspended while waiting for user input, which I don't want. My original question contained pseudocode but I updated it with actual code and described the specific problem I ran into. – Big McLargeHuge Jun 25 '20 at 17:54
  • OK, you are putting the entire pipeline within a single global var method, which is also, I believe, the Groovy equivalent of a singleton (I could be mistaken); I am not completely sure about that. You could make the scope more granular, and that would make this much easier. – Matthew Schuchard Jun 25 '20 at 18:01
  • I haven't tried it yet, but it looks like this plugin is designed to solve this exact problem: https://plugins.jenkins.io/remote-file/ – Big McLargeHuge Jul 04 '20 at 15:10
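
A sketch of the argument-passing approach suggested in the comments, assuming a Map-taking bootstrap(...) signature (not part of my current library) and that the vars script reads config.modules instead of params.modules; the parameter itself would then be defined by the project, e.g. in the job configuration:

// project Jenkinsfile (sketch; parameter names are illustrative)
library 'my-shared-library@master'

// falls back to a default in case the parameter has no value yet
bootstrap(modules: params.modules ?: 'app')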
