I'm trying to set up a Jenkins multibranch pipeline that runs all my code validation steps inside a Docker container, then builds the Docker image and pushes it from outside that container.
Currently, my Jenkinsfile looks sort of like this (slimmed down for readability):
pipeline {
    agent {
        label 'AWS'
    }
    stages {
        stage('stuff in docker') {
            agent {
                dockerfile {
                    filename 'Dockerfile.jenkins'
                    reuseNode true
                }
            }
            steps {
                stuff
            }
        }
        stage('more stuff in docker') {
            agent {
                dockerfile {
                    filename 'Dockerfile.jenkins'
                    reuseNode true
                }
            }
            steps {
                stuff
            }
        }
        stage('stuff not in docker') {
            steps {
                stuff
            }
        }
        stage('more stuff not in docker') {
            steps {
                stuff
            }
        }
    }
    post {
        always {
            cleanWs()
        }
    }
}
The problem here is that for every stage where I use a dockerfile agent, Jenkins attempts to rebuild the Docker image. The build is fully cached, but sending the build context and processing everything still takes more time than I'd like. If I use the dockerfile as the root agent instead, everything runs inside the same container, but I lose the ability to do git operations and to build the Docker image (at least without connecting to the host's Docker socket from inside the container, which seems like more hassle than should be necessary).
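(For clarity, by "root agent" I mean roughly this at the top of the pipeline, with the per-stage agents removed:)

pipeline {
    agent {
        dockerfile {
            filename 'Dockerfile.jenkins'
        }
    }
    ...
}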
I'd like to know if there's some way I can run several stages inside the same Docker image, but then drop back out of that image for the other stages.
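For what it's worth, the closest thing I've come up with is dropping into scripted syntax and using the Docker Pipeline plugin's docker.build / image.inside steps, something like the sketch below (the image name ci-image and the sh 'stuff' bodies are just placeholders). I'm not sure whether this is the intended approach or whether there's a cleaner declarative option:

pipeline {
    agent {
        label 'AWS'
    }
    stages {
        stage('build ci image') {
            steps {
                script {
                    // Build the image from Dockerfile.jenkins once and keep a handle to it
                    ciImage = docker.build('ci-image', '-f Dockerfile.jenkins .')
                }
            }
        }
        stage('stuff in docker') {
            steps {
                script {
                    // Reuse the already-built image instead of rebuilding it
                    ciImage.inside {
                        sh 'stuff'
                    }
                }
            }
        }
        stage('stuff not in docker') {
            steps {
                // Runs directly on the AWS node, outside the container
                sh 'stuff'
            }
        }
    }
    post {
        always {
            cleanWs()
        }
    }
}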