Context
I have a few Jenkins jobs stored as declarative pipelines in a given repository, and I'm using a Jenkins Configuration as Code (CASC) YAML like the following to load them from that repository:
jobs:
  - script: >
      pipelineJob('job-A') {
        definition {
          cpsScm {
            scm {
              git {
                remote {
                  url('https://www.my-company.com/jenkins-jobs-repo.git')
                  credentials('creds')
                }
                branch('*/main')
              }
            }
            scriptPath('jenkins/pipelines/pipeline1.groovy')
            lightweight()
          }
        }
      }
  - script: >
      pipelineJob('job-B') {
        definition {
          cpsScm {
            scm {
              ...
So far, this works nicely: when Jenkins is restarted or re-created, the configuration YAML is loaded and the jobs are created as expected.
Problem
The problem I'm facing is that some of these loaded jobs have either parameters or a cron schedule, for example:
pipeline {
    agent any
    triggers { cron('H H */1 * 1-5') }
    options {
        ...
When the CASC plugin loads and creates the jobs, they have no schedule and are un-parameterized. As a result, the user has to trigger each created job manually at least once: running it makes Jenkins fetch the pipeline code and update both the cron and the parameter configuration.
I'd like to avoid this situation and have CASC create the jobs already scheduled and parameterized, without any human interaction.
Proposed solutions
The first idea I had is to simply duplicate the cron/parameter declarations in the CASC YAML, since the job-dsl plugin lets you specify those in the job definitions, as sketched below. But duplicating them doesn't feel right, as it would be error-prone: one could update the Groovy files and forget to update the YAML.
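If I understand the job-dsl API correctly, the duplication would look roughly like this. This is only a sketch: the ENVIRONMENT parameter is a made-up example, and I'm assuming the properties/pipelineTriggers/cron syntax that job-dsl exposes for pipeline jobs:

jobs:
  - script: >
      pipelineJob('job-A') {
        // duplicated from jenkins/pipelines/pipeline1.groovy, kept in sync by hand
        parameters {
          // hypothetical parameter, just for illustration
          stringParam('ENVIRONMENT', 'staging', 'Target environment')
        }
        properties {
          pipelineTriggers {
            triggers {
              cron { spec('H H */1 * 1-5') }
            }
          }
        }
        definition {
          cpsScm {
            ...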
Another option would be to remove the cron and parameters from the pipelines and keep them only in the CASC YAML (see the sketch below). Is this the way to go? What is the best practice here? This option would probably fix everything, but shouldn't the cron/params info live in the pipeline itself?
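For illustration, with this second option the pipeline itself would shrink to something like the following, with ENVIRONMENT being the same made-up parameter as above, now defined only in the YAML:

pipeline {
    agent any
    // no triggers {} or parameters {} blocks: both would live only in the CASC YAML
    stages {
        stage('Build') {
            steps {
                // parameters defined in the job-dsl YAML are still exposed to the
                // pipeline at runtime through the params object
                echo "Running for ${params.ENVIRONMENT}"
            }
        }
    }
}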