
What I understand so far: this makes sense, I pass a file in instead of hardcoding the script values inline:

jobs:
  - script: >
      multibranchPipelineJob('configuration-as-code') {
          branchSources {
              git {
                  id = 'configuration-as-code'
                  remote('https://github.com/jenkinsci/configuration-as-code-plugin.git')
              }
          }
      }

jobs:
  - file: ./jobdsl/job.groovy

Current setup: In my Jenkins Helm values file, under the JCasC section, I have the following. This works, along with other configuration that I am not showing here because it is not relevant to this discussion:

   JCasC:
      defaultConfig: true
      configScripts:
        pipeline-job: |
          jobs:
            - script: >
                multibranchPipelineJob('testrepo') {
                  branchSources {
                    git {
                      id('testrepo')
                      credentialsId('bitbucketv1')
                      remote('https://bitbucket.org/repo/test.git')
                      includes("master develop")
                      excludes("")
                    }
                  }
                }

Issue:

How do I take my piece of code and pass it in as a file, like "- file: ./jobdsl/job.groovy"? I created a folder called jobdsl inside my Jenkins Helm chart folder and added job.groovy containing the multibranchPipelineJob code snippet from above, but I get an error saying the file does not exist.

Do I need to create a ConfigMap and load this file into Jenkins first? Or is there a way to pass the file from my local machine to the JCasC script in my Helm chart snippet above? Also, what is the correct format of job.groovy? Does it include the "- script: >" part, or just from multibranchPipelineJob and down?
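For reference, here is what I currently have in jobdsl/job.groovy, assuming the file should contain only the Job DSL code without the YAML "- script: >" wrapper (that assumption may be part of my problem):

multibranchPipelineJob('testrepo') {
  branchSources {
    git {
      id('testrepo')
      credentialsId('bitbucketv1')
      remote('https://bitbucket.org/repo/test.git')
      includes("master develop")
      excludes("")
    }
  }
}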


1 Answer


Unfortunately, I'm not sure this is possible. I was trying to figure out the same exact thing, to no avail. However, I'm going to document my steps here, in case I'm missing a step and someone can help fill it in.

I'm using the Jenkins Helm chart (https://github.com/jenkinsci/helm-charts) and have deployed my instance on Minikube.

Based on the available chart values which can be set, I added the following:

persistence:
  enabled: true
  accessMode: "ReadWriteOnce"
  size: "100Gi"
  volumes:
    - name: jacsc-jenkins-jobdsl-pipelines
      configMap:
        name: jacsc-jenkins-jobdsl-pipelines
  mounts:
    - mountPath: /var/jenkins_home/jobdsl
      name: jacsc-jenkins-jobdsl-pipelines

The idea here was that, since all JCasC scripts get saved into the /var/jenkins_home/casc_configs folder (you can see this by digging into the Helm chart code), I could mount my Job DSL code into the pod as well and reference it.
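For reference, the jacsc-jenkins-jobdsl-pipelines ConfigMap backing that volume looked roughly like this (the Groovy body below is only a placeholder; my real image-builder.groovy holds my own pipeline definition):

apiVersion: v1
kind: ConfigMap
metadata:
  name: jacsc-jenkins-jobdsl-pipelines
  namespace: jenkins
data:
  image-builder.groovy: |
    // placeholder Job DSL script; the real file defines the image-builder pipeline
    pipelineJob('image-builder') {
      definition {
        cps {
          script("node { echo 'placeholder' }")
          sandbox()
        }
      }
    }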

Here is the ConfigMap which I used to add my JCasC code to the Jenkins pod:

apiVersion: v1
kind: ConfigMap
metadata:
  name: jacsc-jenkins-jobs
  namespace: jenkins
  labels:
    "jenkins-jenkins-config": "true"
data:
  jacsc-jenkins-jobs.yaml: |
    jobs:
      - script: >
          folder('Tests')
      - file: ../../../../jobdsl/image-builder.groovy

However, this was to no avail. I tried referencing the file relative to my mounted ConfigMap file, as well as relative to the location of what seems to be the compiled Java code for the Job DSL plugin. Each time, it said it could not find the file.
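For concreteness, one of the variants I tried was the absolute path of the mount, along the lines of:

jobs:
  - file: /var/jenkins_home/jobdsl/image-builder.groovy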

Based on what I found out about the File() lookup used here (https://github.com/jenkinsci/job-dsl-plugin/blob/master/job-dsl-plugin/src/main/groovy/javaposse/jobdsl/plugin/casc/FromFileScriptSource.java), my assumption is that it looks up the path via the job workspace, and this is not the same location as the Job DSL code mounted on the pod.

If anyone finds anything else about this to prove that it DOES work, please tell!
