
I am pretty new to this whole subject, so excuse me if those are silly questions. I want to run unit tests in Docker containers like so:

  • Start up a Postgres container with no tables/data
  • Run the DB migration (I am using node-pg-migrate) to create all the tables
  • Populate the DB with test data
  • Start a Node container with my service
  • Run the unit tests
  • Drop the database
  • Shut down/delete all containers (except if there were errors)
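
The whole cycle above could be driven by a small wrapper script, for example (a sketch, assuming the compose file below and that the test container is named web; --exit-code-from makes docker-compose exit with the test container's status):

#!/usr/bin/env bash
# run-tests.sh -- build images, run the test container, tear down on success.
docker-compose up --build --abort-on-container-exit --exit-code-from web
status=$?

# Keep containers around for inspection if something failed;
# otherwise remove containers and the Postgres data volume.
if [ "$status" -eq 0 ]; then
    docker-compose down --volumes
fi
exit "$status"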

I am currently struggling with running the migration. I create the service image FROM my prod image and RUN npm install --only=dev. My docker-compose.yml looks like this:

version: "3.7"
services:
    db:
        image: postgres
        environment: 
            POSTGRES_PASSWORD: test_pw
            POSTGRES_USER: test_user
            POSTGRES_DB: test_db

        ports:
            - '5432:5432'

    web:
        image: docker_tests
        environment:
            DATABASE_URL: postgres://test_user:test_pw@db:5432/test_db?ssl=false
            DATABASE_SSL: "false"
            DATABASE_PW: test_pw
            DATABASE_USER: test_user
            DATABASE_NAME: test_db
            DATABASE_HOST: db
        depends_on: 
            - db
        ports:
            - '1337:1337'

And my Dockerfile:

FROM docker_service
ENV NODE_ENV TEST
WORKDIR /usr/src/app
RUN npm install --only=dev
RUN ["./node_modules/.bin/node-pg-migrate", "up"]
EXPOSE 1337
CMD ["npm", "run", "test"]

When running the composition, both containers start and I even get

No migrations to run!
Migration complete!

However, if I run the tests, no tables are available. The migration was not applied to my Postgres container, and when adding RUN ["printenv"] after the migration it becomes clear why: the necessary DATABASE_URL is not there. I googled and found that the environment variables specified in docker-compose.yml are only available at runtime, not at build time. However, when I add ENV DATABASE_URL ... to my Dockerfile, it of course cannot connect to the database, since the Postgres container hasn't started yet.
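
The build-time/runtime split can be illustrated with a minimal Dockerfile (a sketch; the base image is an arbitrary example):

FROM node:18-alpine
# Build time: only values set via ENV/ARG in the Dockerfile are visible here,
# so this grep finds nothing when DATABASE_URL comes from docker-compose.yml.
RUN printenv | grep DATABASE_URL || echo "DATABASE_URL not set at build time"
# Runtime: Compose injects its environment: values before CMD runs,
# so the same lookup succeeds once the container is started by Compose.
CMD ["sh", "-c", "printenv DATABASE_URL"]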

How do I solve this problem? One possible solution would be to run ./node_modules/.bin/node-pg-migrate up in web as soon as both containers are up, but Docker can only have one CMD, right? And I use it to run my unit tests.

TL;DR: How do I run migrations in a Docker Postgres-container using node-pg-migrate from a Docker service-container before running unit tests?

Thanks a lot!

gizarmaluke

2 Answers


I couldn't test this, but here's the idea:

  • have a docker-entrypoint.sh script in the same folder as Dockerfile
#!/usr/bin/env bash

./node_modules/.bin/node-pg-migrate up
exec "$@"
  • change Dockerfile to use that script
FROM docker_service
ENV NODE_ENV TEST
WORKDIR /usr/src/app
RUN npm install --only=dev

COPY docker-entrypoint.sh /usr/local/bin/
RUN chmod 777 /usr/local/bin/docker-entrypoint.sh && \
    ln -s usr/local/bin/docker-entrypoint.sh / # backwards compat

ENTRYPOINT ["docker-entrypoint.sh"]
EXPOSE 1337
CMD ["npm", "run", "test"]

This way the docker-entrypoint.sh script will be executed every time the container starts, and it will run npm run test afterwards because of exec "$@". Every official DB image uses this kind of setup, and I advise you to take a look at the PostgreSQL Dockerfile and entrypoint script.
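
The exec "$@" idiom can be tried locally, outside Docker (a minimal sketch; the echoed messages are placeholders for the real migration and test commands):

```shell
# Write a throwaway entrypoint script that mimics the one above.
cat > /tmp/entrypoint-demo.sh <<'EOF'
#!/usr/bin/env bash
echo "running migrations (placeholder)"
# Replace this shell process with whatever command was passed in --
# inside a container, that command then becomes PID 1.
exec "$@"
EOF
chmod +x /tmp/entrypoint-demo.sh

# Simulates ENTRYPOINT ["docker-entrypoint.sh"] + CMD ["npm", "run", "test"]:
/tmp/entrypoint-demo.sh echo "npm run test would start here"
```

The script's own commands run first, then exec hands the process over to the arguments it received, which is why the CMD still executes after the migration.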


You might need something like wait-for-it to make sure the DB has started, before executing migration scripts. depends_on helps you with start order but doesn't wait for the service to be healthy.

Resource: Control startup and shutdown order in Compose
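
As an alternative to wait-for-it, recent Docker Compose versions (those following the Compose Specification; the long-form depends_on was dropped in the 3.x file format and later reinstated) let you gate startup on a healthcheck. A sketch, reusing the OP's service names and credentials:

services:
    db:
        image: postgres
        healthcheck:
            # pg_isready ships with the postgres image and exits 0
            # once the server accepts connections.
            test: ["CMD-SHELL", "pg_isready -U test_user -d test_db"]
            interval: 2s
            timeout: 5s
            retries: 10
    web:
        image: docker_tests
        depends_on:
            db:
                condition: service_healthy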

Stefan Golubović
  • Thanks for your response! If I do it like that the migration does not run at all. I added an `echo test` to the .sh-file and did not get the echo when running it. The tests, however, run. Also, I added retry logic to my db-connection which should take care of the db not being available at first. I am gonna look into the links you posted tomorrow. Unfortunately, my Linux/Bash know-how isn't great either, so I have no idea what `exec "$@"` does. Googled it but did not understand what I found, heh. – gizarmaluke Mar 25 '20 at 18:09
  • Yes, you're right, I made some mistakes. I've corrected some of them in the answer, but I've also created [MWE on GitHub](https://github.com/g0loob/node-pg-migrate-test). As for `exec "$@"`, `exec` is used in bash to execute the command that was passed to it, and `$@` is short for _give me all passed arguments_. In this example, we have the `docker-entrypoint.sh` script that is executed like `bash docker-entrypoint.sh npm run test`, so after the script finishes, `npm run test` will be executed. – Stefan Golubović Mar 25 '20 at 20:32
  • Thanks a lot, that seems to have worked! What exactly was the problem before? I see that you added a `COPY` statement - is that really necessary? My service image gets built every time I start a test, copying . to ., thus, the file should be available, no? So I guess the problem was the missing `chmod`? I googled it and found that this is for permission granting? Why did I not get a missing-permission error before then? I just try to understand what I did wrong and what made it work now so I can learn from that. Thanks again! :) – gizarmaluke Mar 26 '20 at 11:30
  • In my case, I forgot to copy the script into the container, so no script could be executed. If you `COPY . .` then your script is in `/usr/src/app`, and I'm guessing that path is not in your `$PATH`, but you can update your `PATH` with `ENV PATH="/usr/src/app:${PATH}"`. That's why it was copied into `/usr/local/bin`. Of course, if you want to run some script/file it needs to be executable, so you need to add the execution permission (`x`). I copied the command from the PostgreSQL Dockerfile, so it's adding all permissions - 777 means read, write and execute for everyone. – Stefan Golubović Mar 26 '20 at 11:57
  • Two additional questions: 1. Why do we copy the docker-entrypoint.sh into /usr/local/bin and not into /usr/src/app? 2. Is the ln command only to make it also work on older versions of Linux? – gizarmaluke Mar 27 '20 at 11:24
  • 1. You can copy into `/usr/src/app` but that path needs to be part of `PATH` environment variable, because when you type some command in Linux, OS tries to find that executable in one of the directories that are part of `PATH` (you can `exec` into the container and do `echo $PATH`). 2. I need to do research on that because I don't know at the moment. As I said, I copied that line from PostgreSQL's Dockerfile. – Stefan Golubović Mar 27 '20 at 16:08

I might be late to the conversation, but the following steps work for me.

docker-compose.yml

    version: "3.7"
    services:
        db:
            #same as OP
        web:
            #same as OP
        migration:
            build:
                context: .
            command:
                [
                    "./wait-for-it/wait-for-it.sh",
                    "db:5432",
                    "--",
                    "npm",
                    "run",
                    "migrate",
                    "up"
                ]
            links:
                - db
            depends_on:
                - db
            environment:
                - DATABASE_URL=postgres://username:password@host:5432/dbname

Dockerfile

    FROM docker_service
    ## optional 
    #RUN apk update && apk add git
    #RUN /bin/sh -c "apk add --no-cache bash"
    
    ENV NODE_ENV TEST
    WORKDIR /usr/src/app
    RUN npm install --only=dev
    RUN git clone https://github.com/vishnubob/wait-for-it.git
    EXPOSE 1337
    CMD ["npm", "run", "test"]

Add "migrate": "node-pg-migrate" in script section of package.json

Reference:

https://javascript.plainenglish.io/run-migrations-using-docker-in-node-and-pg-c6d80e7cd578

https://github.com/vishnubob/wait-for-it

mirsahib
  • Perfect, and thank you for the good solution. One note: node-alpine doesn't include git by default, so I used the node image to make the git clone work. – Ali Bayatpour Dec 28 '22 at 08:52