
I want to use Jenkins Pipeline to build, push, and deploy my Docker image.

I get this:

Got permission denied while trying to connect to the 
Docker daemon socket at unix:///var/run/docker.sock

Other questions on Stack Overflow suggest running sudo usermod -a -G docker jenkins and then restarting Jenkins, but I do not have access to the machine running Jenkins. In any case, it seems strange that Jenkins Pipeline, which is built around Docker, cannot run a basic Docker command.
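For readers who do control the Jenkins host, the commonly suggested fix looks like this (the group name docker and the service name jenkins are assumptions that match a typical package install):

```shell
# Add the jenkins user to the docker group so it can reach /var/run/docker.sock
sudo usermod -a -G docker jenkins

# Restart Jenkins so the new group membership takes effect
sudo systemctl restart jenkins
```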

How can I build my Docker image?

pipeline {
    agent any
    stages {
        stage('deploy') {
            agent {
                docker {
                    image 'google/cloud-sdk:latest'
                    args '-v /var/run/docker.sock:/var/run/docker.sock'
                }
            }
            steps {
                script {
                    docker.build("gcr.io/myproject/mydockerimage:1")
                }
            }
        }
    }
}
Joshua Fox

1 Answer


The pipeline definition shown tries to execute docker build inside a Docker container (google/cloud-sdk:latest). Instead, you should do the following, provided the jenkins user on the host has permission to execute Docker commands on the host.

pipeline {
  agent any
  stages {
    stage('deploy') {
      steps {
        script {
          docker.build("gcr.io/myproject/mydockerimage:1")
        }
      }
    }
  }
}
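For completeness, the build is usually followed by a push; a minimal sketch using the Docker Pipeline plugin's `docker.withRegistry`, where the registry URL and the 'gcr-credentials' credentials ID are assumptions you would replace with your own:

```groovy
pipeline {
  agent any
  stages {
    stage('deploy') {
      steps {
        script {
          // Build the image against the host's Docker daemon
          def image = docker.build("gcr.io/myproject/mydockerimage:1")
          // Push it; 'gcr-credentials' is a hypothetical credentials ID
          // configured in the Jenkins credentials store
          docker.withRegistry('https://gcr.io', 'gcr-credentials') {
            image.push()
          }
        }
      }
    }
  }
}
```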

There is nothing strange about Jenkins being unable to execute Docker commands without the proper permissions, since Jenkins and Docker are installed and configured separately on the machine.

Josnidhin
  • What is the key difference here? Just the lack of the "agent { docker {...." block? – Joshua Fox Nov 27 '18 at 12:05
  • 1
    Yes. When you add an agent section inside a stage it will override the agent defined in the top level. – Josnidhin Nov 27 '18 at 12:56
  • Your solution is correct -- but I still wonder: why NOT run the build inside a Docker container? For example, that lets me install build tools like kubectl and gcloud, and set their credentials, without worrying about disrupting the GCE VM the build runs in. – Joshua Fox Dec 02 '18 at 06:50
  • 1
    I think it is a bad idea to have Docker installed inside a Docker container. You can probably split the work into multiple stages: build the Docker image on the host and set some variables, then do something with it in a different stage, which can run inside a container. Pipeline also has `withCredentials` to handle credentials securely. – Josnidhin Dec 02 '18 at 13:41
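The `withCredentials` approach mentioned in the last comment can be sketched like this, for a containerized stage that deploys without needing Docker itself; the 'gcp-sa-key' Secret File credential ID and the deployment.yaml file are assumptions:

```groovy
stage('deploy') {
  agent {
    docker { image 'google/cloud-sdk:latest' }
  }
  steps {
    // 'gcp-sa-key' is a hypothetical Secret File credential stored in Jenkins;
    // withCredentials exposes it as a temp file via the GC_KEY variable
    withCredentials([file(credentialsId: 'gcp-sa-key', variable: 'GC_KEY')]) {
      sh 'gcloud auth activate-service-account --key-file="$GC_KEY"'
      sh 'kubectl apply -f deployment.yaml'
    }
  }
}
```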