
My Jenkinsfile has several parameters. Every time I update the parameters (e.g. remove or add an input) and commit the change to my SCM, I do not see the job's input screen updated accordingly in Jenkins; I have to run an execution and cancel it, and only then do I see my updated fields from:

properties([
  parameters([
    string(name: 'a', defaultValue: 'aa', description: '*'),
    string(name: 'b', description: '*'),
    string(name: 'c', description: '*')
  ])
])

any clues?

NicolasW
  • Same problem here... I added "Suppress automatic SCM triggering" to avoid autobuilds. That worked - too good - because now even a "Scan Multibranch Pipeline Now" is disabled and I am no longer able to "rescan" the branches... – eventhorizon Nov 17 '17 at 13:54
  • Finally I added a "default" (first entry) to a choice parameter and checked the param in the first / in every stage... Bad but working workaround... – eventhorizon Nov 17 '17 at 14:47
  • The following issue might be important - if ever implemented - https://issues.jenkins-ci.org/browse/JENKINS-38442 – eventhorizon Nov 17 '17 at 14:48

6 Answers


One of the ugliest things I've done to get around this is create a Refresh parameter which basically exits the pipeline right away. This way I can run the pipeline just to update the properties.

pipeline {
    agent any
    parameters {
        booleanParam(name: 'Refresh',
                    defaultValue: false,
                    description: 'Read Jenkinsfile and exit.')
    }
    stages {
        stage('Read Jenkinsfile') {
            when {
                expression { return parameters.Refresh == true }
            }
            steps {
                echo("Ended pipeline early.")        
            }
        }
        stage('Run Jenkinsfile') {
            when {
                expression { return parameters.Refresh == false }
            }
            stage('Build') {
                // steps
            }
            stage('Test') {
                // steps
            }
            stage('Deploy') {
                // steps
            }
        }
    }
}

There really must be a better way, but I'm yet to find it :(

TomDotTom
  • It's actually crazy this is the only method. You would have thought there would be configuration in the git source to scan and check for changes. – Patrick Magee Oct 03 '19 at 15:55
  • This has been 2 years since this answer. I haven't found a more graceful way to do it. Wonder if anyone else has... – Roger Ray Dec 29 '20 at 10:11
  • I have a similar solution. I have a dry-run parameter. When this is given, I have some "light" run, in which I mostly check syntax. I use it also to refresh parameters. On the first run, when parameters are not available on GUI, it will do the work, but will exit error. – hagits Jan 19 '21 at 15:14
  • I also use a similar method. I have a "dry run" mode, and a "skip dry run mode", both of which are useful in their use cases. But if you enable both, it results in essentially a "no-op" run. – Max Cascone Sep 20 '22 at 19:27

Unfortunately the answer of TomDotTom did not work for me - I had the same issue, and my Jenkins required an additional stages block under 'Run Jenkinsfile' because of the following error:

Unknown stage section "stage". Starting with version 0.5, steps in a stage must be in a ‘steps’ block.

I am also using params instead of parameters as the variable to check the condition (as described in the Jenkins syntax documentation).

pipeline {
    agent any
    parameters {
        booleanParam(name: 'Refresh',
                    defaultValue: false,
                    description: 'Read Jenkinsfile and exit.')
    }
    stages {
        stage('Read Jenkinsfile') {
            when {
                expression { return params.Refresh == true }
            }
            steps {
              echo("stop")
            }
        }
        stage('Run Jenkinsfile') {
            when {
                expression { return params.Refresh == false }
            }
            stages {
              stage('Build') {
                  steps {
                    echo("build")
                  }
              }
              stage('Test') {
                  steps {
                    echo("test")
                  }
              }
              stage('Deploy') {
                  steps {
                    echo("deploy")
                  }
              }
            }
        }
    }
}

Applied to Jenkins 2.233.

fty4

Apparently it is a known Jenkins "issue" or "hidden secret": https://issues.jenkins.io/browse/JENKINS-41929.

I overcome this automatically using the Jenkins Job DSL plugin.

I have a Job DSL seed job for my pipelines that checks for changes in the git repository containing my pipelines.

pipelineJob('myJobName') {
    // sets RELOAD=true for when the job is 'queued' below
    parameters {
        booleanParam('RELOAD', true)
    }

    definition {
        cps {
            script(readFileFromWorkspace('Jenkinsfile'))
            sandbox()
        }
    }

    // queue the job to run so it re-downloads its Jenkinsfile
    queue('myJobName')
}

Upon changes the seed job runs and regenerates the pipeline's configuration, including its params. After the pipeline is created/updated, Job DSL queues it with the special RELOAD param set.

The pipeline then reacts to it in its first stage and aborts early. (Apparently there is no way in Jenkins to stop a pipeline at the end of a stage without raising an error, which marks the pipeline "red".)

As the parameters in the Jenkinsfile are set via properties, they override anything set by the seed job, such as RELOAD. At this point the pipeline is ready with its actual params, without any sign of RELOAD to confuse users.

properties([
    parameters([
        string(name: 'PARAM1', description: 'my Param1'),
        string(name: 'PARAM2', description: 'my Param2'),
    ])
])

pipeline {
    agent any
    stages {
        stage('Preparations') {
            when { expression { return params.RELOAD == true } }
            // Because this: https://issues.jenkins-ci.org/browse/JENKINS-41929
            steps {
                script {
                    if (currentBuild.getBuildCauses('hudson.model.Cause') != null) {
                        currentBuild.displayName = 'Parameter Initialization'
                        currentBuild.description = 'On first build we just load the parameters as they are not available of first run on new branches.  A second run has been triggered automatically.'
                        currentBuild.result = 'ABORTED'

                        error('Stopping initial build as we only want to get the parameters')
                    }
                }
            }
        }

        stage('Parameters') {
            steps {
                echo 'Running real job steps...'                
            }
        }
    }
}

The end result is that every time I update anything in the pipeline repository, all jobs generated by the seed job are updated and run once to get the updated params list. Such runs are labeled "Parameter Initialization" to indicate what they are.


There is potentially a way to improve this and only update the affected pipelines, but I haven't explored that, as all my pipelines are in one repository and I'm happy with always updating them.

Another upgrade could be, if someone doesn't like aborting with error(), to add a when condition to every other stage to skip it when the RELOAD param is set (see the sketch below), but I find adding when to every other stage cumbersome.
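
For illustration, a minimal sketch of that alternative (assuming the same RELOAD param injected by the seed job); each regular stage is guarded instead of aborting the build:

pipeline {
    agent any
    stages {
        stage('Build') {
            // Skip this stage on the parameter-initialization run triggered by the seed job
            when { expression { return params.RELOAD != true } }
            steps {
                echo 'Running real job steps...'
            }
        }
    }
}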

I initially tried @TomDotTom's answer, but I didn't like the manual effort.

Mior

The Jenkinsfile needs to be executed in order to update the job properties, so you need to start a build with the new file.
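
In practice a run that merely executes the properties() step is enough, since that step is what rewrites the job configuration; a minimal sketch of such a refresh-only Jenkinsfile (the node body is purely illustrative):

properties([
    parameters([
        string(name: 'a', defaultValue: 'aa', description: '*')
    ])
])

node {
    echo 'Job properties refreshed from the updated Jenkinsfile.'
}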

Christopher Orr
  • Based on your experience, any suggestion/trick in mind? e.g. I'm hoping to have a large number of jobs depending on the Jenkinsfile, so I need a way to tell each of these jobs to reload the updated parameter(s) when I update it/them, and I do not wish to run & cancel each individual task. – NicolasW Jun 08 '17 at 13:40
  • Why is it necessary to execute it? Checking out and parsing the Jenkinsfile should be enough. – angelcervera Jun 05 '19 at 07:55
  • @angelcervera By default Jenkins uses the Jenkinsfile from the previous run. This doesn't really make much sense and is outright incorrect, but it's done that way for efficiency. Supposedly there's also a box in that area of the UI that you can uncheck to fix this problem. – interestedparty333 Oct 28 '20 at 20:16

Scripted pipeline workaround - it can probably be made to work in declarative as well.

Since you are using SCM, you can check which files have changed since the last build (see here), and then decide what to do based on that. Note that SCM polling must be enabled on the job so that Jenkinsfile changes are detected automatically.

node('master') {
    checkout scm
    if (checkJenkinsfileChanges()) {
        return // exit the build immediately 
    } 
    echo "build" // build stuff
}

private Boolean checkJenkinsfileChanges() {
    def filesChanged = getChangedFilesList()
    def jenkinsfileChanged = filesChanged.contains("Jenkinsfile")

    if (jenkinsfileChanged) {
        if (filesChanged.size() == 1) {
            echo "Only Jenkinsfile changed, quitting"
        } else {
            echo "Rescheduling job with updated Jenkinsfile"
            build job: env.JOB_NAME
        }
    }
    return jenkinsfileChanged
}

// returns the list of files changed since the last build
private List<String> getChangedFilesList() {
    def changedFiles = []
    for (changeLogSet in currentBuild.changeSets) {
        for (entry in changeLogSet.getItems()) { // for each commit in the detected changes
            for (file in entry.getAffectedFiles()) {
                changedFiles.add(file.getPath()) // add changed file to list
            }
        }
    }
    return changedFiles
}
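
The first line above mentions declarative pipelines; a rough sketch of how the same check could be applied there, reusing the getChangedFilesList() helper defined above inside a when condition:

pipeline {
    agent any
    stages {
        stage('Build') {
            // Skip the real work when the only change since the last build is the Jenkinsfile itself
            when {
                expression {
                    def files = getChangedFilesList()
                    return !(files.size() == 1 && files.contains('Jenkinsfile'))
                }
            }
            steps {
                echo "build" // build stuff
            }
        }
    }
}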
Itay Sued

I solve this by using the Jenkins Job Builder Python package. The main goal of this package is to achieve Jenkins Job as Code.

To solve your problem I simply use a definition like the one below and keep it in SCM, together with a Jenkins pipeline that listens for changes to the jobs.yaml file and rebuilds the job for me, so that whenever I trigger my job all the needed parameters are ready.

jobs.yaml

- job:
      name: 'job-name'
      description: 'deploy template'
      concurrent: true
      properties:
        - build-discarder:
            days-to-keep: 7
        - rebuild:
            rebuild-disabled: false
      parameters:
        - choice:
            name: debug
            choices:
              - Y
              - N
            description: 'debug flag'
        - string:
            name: deploy_tag
            description: "tag to deploy, default to latest"
        - choice:
            name: deploy_env
            choices:
              - dev
              - test
              - preprod
              - prod
            description: "Environment"
      project-type: pipeline
      # you can use either DSL or pipeline SCM
      dsl: |
          node() {
             stage('info') {
               print params
             }
          }
      # pipeline-scm:
      #   script-path: Jenkinsfile
      #   scm:
      #     - git:
      #         branches:
      #           - master
      #         url: 'https://repository.url.net/x.git'
      #         credentials-id: 'jenkinsautomation'
      #         skip-tag: true
      #         wipe-workspace: false
      #   lightweight-checkout: true

config.ini

[job_builder]
allow_duplicates = False
keep_descriptions = False
ignore_cache = True
recursive = False
update = all
[jenkins]
query_plugins_info = False
url = http://localhost:8080

Command to load / update the job:

jenkins-jobs --conf config.ini -u $JENKINS_USER -p $JENKINS_PASSWORD update jobs.yaml

Note - to use the jenkins-jobs command, you need to install the jenkins-job-builder Python package (e.g. with pip install jenkins-job-builder).

This package has a lot of features: creating (free-style, pipeline, multibranch), updating, deleting and validating Jenkins job configurations. It supports templates, meaning that with one generic template you can build any number of similar jobs, dynamically generate parameters, etc.

Samit Kumar Patel