
I'm having some issues setting up a Docker image that will eventually be used on Bitbucket Pipelines to run tests for some API projects I work on.

The setup that our local developers have is pretty simple: Java 8 + Maven + PostgreSQL 9.6 running on the machine.

For the integration tests to run, the DB has to be running, and part of the Maven build creates the necessary databases and tables on the server.

I tried to replicate this on a docker image and this is what I have so far.

FROM maven:3.5.3-jdk-8-slim

#Install postgresql
RUN apt update && \
    mkdir -p /usr/share/man/man1 &&\
    mkdir -p /usr/share/man/man7 &&\
    apt install -y postgresql-9.6

#Update config
RUN echo "host  all all 127.0.0.1/32  trust" >> /etc/postgresql/9.6/main/pg_hba.conf

#Start server
RUN service postgresql start

#Create readonly role for DB
USER postgres
RUN psql -c "CREATE ROLE readonly"

These steps appear to work just fine if I run them manually inside a container started with:

docker run -it maven:3.5.3-jdk-8-slim /bin/bash

But when I try to build the image, this is what I get:

▶ docker build -t pipelines .
Sending build context to Docker daemon  2.048kB
Step 1/6 : FROM maven:3.5.3-jdk-8-slim
 ---> 25f97112c73f
Step 2/6 : RUN apt update &&     mkdir -p /usr/share/man/man1 &&    mkdir -p /usr/share/man/man7 &&    apt install -y postgresql-9.6
 ---> Using cache
 ---> 5fa381f73c9d
Step 3/6 : RUN echo "host  all all 127.0.0.1/32  trust" >> /etc/postgresql/9.6/main/pg_hba.conf
 ---> Using cache
 ---> 0721966e7749
Step 4/6 : RUN service postgresql start
 ---> Using cache
 ---> 9ca8e7a270e0
Step 5/6 : USER postgres
 ---> Using cache
 ---> 50a99c6cac20
Step 6/6 : RUN psql -c "CREATE ROLE readonly"
 ---> Running in 572b2b8fa754
psql: could not connect to server: No such file or directory
    Is the server running locally and accepting
    connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
The command '/bin/sh -c psql -c "CREATE ROLE readonly"' returned a non-zero code: 2

I've read that this error can mean the client is trying to reach the server on the host machine rather than inside the container, but I can't understand what the difference is between running the commands manually and letting the Dockerfile "run" them.

I've also tried building it from a postgres:9.6 base image and then installing Java + Maven, but when I need to create the role on the DB, the issue is the same.

There must be something I'm doing wrong here, given that my intention is to use this on Bitbucket Pipelines.


1 Answer


Use RUN when you're building an image, for example to install PostgreSQL. Each RUN builds a layer on top of the previous RUN instructions.

Use CMD when you want a command to be executed by default when you run the image (i.e. create a container). Running the command manually inside a running container is therefore equivalent to what CMD does in a Dockerfile.

I would recommend putting your commands in a script and using it as an ENTRYPOINT, so that it starts the server and then creates the roles, tables, test data and so on. In a Dockerfile you usually have multiple RUN instructions that build on top of the last layer, but CMD overrides previous CMD instructions.

There can only be one CMD instruction in a Dockerfile. If you list more than one CMD then only the last CMD will take effect.
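
As a rough sketch (the script name docker-entrypoint.sh, the readonly role and the mvn verify default command are just placeholders for whatever your build actually needs), the entrypoint could start the server, prepare the database and then hand control over to whatever command the container was given:

#!/bin/bash
# docker-entrypoint.sh - start PostgreSQL, prepare the test database,
# then run whatever command was passed to the container.
set -e

service postgresql start

# wait until the server accepts connections before using psql
until su postgres -c "pg_isready -q"; do
    sleep 1
done

su postgres -c 'psql -c "CREATE ROLE readonly"'

exec "$@"

The Dockerfile would then wire it in instead of the RUN service/psql steps (note the container has to stay as root for service to work, so the USER postgres instruction goes away):

COPY docker-entrypoint.sh /usr/local/bin/
RUN chmod +x /usr/local/bin/docker-entrypoint.sh
ENTRYPOINT ["docker-entrypoint.sh"]
CMD ["mvn", "verify"]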

I would also encourage you to read this answer explaining RUN, CMD and ENTRYPOINT.

Summary

The RUN instruction actually runs the command during the image build.

The CMD instruction is not executed during the build; it is executed when the container starts, unless the user provides a different command.
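
A tiny example of that difference (the base image and the messages are arbitrary):

FROM debian:stretch-slim
RUN echo "this prints while the image is being built"
CMD ["echo", "this prints every time a container starts"]

Building this image shows the first message in the docker build output; the second one only appears when you docker run the resulting image, and it is replaced entirely if you pass another command to docker run.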

That was it! Thanks. I will take a look into incorporating the script once I have something that works, then I will think about cleaning up. – Pedro Garcia Mota Jun 14 '18 at 06:23