
I have two projects for different clients:

A. Ruby 2.7.2 and Rails 6

B. Ruby 2.5.8 and Rails 5.2

Each one has its own docker-compose file and associated setup scripts, following the awesome EvilMartians blog post here.

I can build both projects fine and everything runs.

However, every time I switch projects, I have to rebuild, which seems to defeat the purpose of having self-contained docker containers.

This is because docker-compose run rails throws an error, depending on which project ran last:

If I run Project A first, then switch to B:

Your Ruby version is 2.7.2, but your Gemfile specified 2.5.8

If I run Project B first, then switch to A:

Your Ruby version is 2.5.8, but your Gemfile specified 2.7.2

If I try to solve it by running the setup again (i.e. docker-compose run runner ./bin/setup), I get the same error.

So I have to run docker-compose build to force the images to rebuild using the ARG that picks up the correct ruby version. But surely that should not be necessary.

So Project A is remembering the Ruby version of Project B and vice versa.

It seems they are sharing the same underlying image produced by the build command.

Does it make sense (is there even a way?) to have them NOT share images, i.e. can I somehow tag the images so they are 'tied' to each project, maybe based on the Ruby version?

Or is it standard practice to just rebuild?


THE FILE

docker-compose

Each project has its own (almost identical) docker-compose file, which you can see below (only the top part shown for brevity).

Obviously the thing that differs between A and B is the versions of the dependencies specified:

version: '2.4'

x-app: &app
  build:
    context: .dockerdev
    dockerfile: Dockerfile
    args:
      RUBY_VERSION: '2.5.8'
      PG_MAJOR: '13'
      NODE_MAJOR: '12'
      YARN_VERSION: '1.13.0'
      BUNDLER_VERSION: '1.17.2'
  environment: &env
    NODE_ENV: development
    RAILS_ENV: ${RAILS_ENV:-development}
  image: example-dev:1.1.0
  tmpfs:
    - /tmp

x-backend: &backend
  <<: *app
  stdin_open: true
  tty: true
  volumes:
    - .:/app:cached
    - rails_cache:/app/tmp/cache
    - bundle:/usr/local/bundle
    - node_modules:/app/node_modules
    - packs:/app/public/packs
    - .dockerdev/.psqlrc:/root/.psqlrc:ro
    - .dockerdev/.bashrc:/root/.bashrc:ro
  environment:
    <<: *env
    REDIS_URL: redis://redis:6379/
    DATABASE_URL: postgres://postgres:postgres@postgres:5432
    BOOTSNAP_CACHE_DIR: /usr/local/bundle/_bootsnap
    WEBPACKER_DEV_SERVER_HOST: webpacker
    WEB_CONCURRENCY: 1
    HISTFILE: /app/log/.bash_history
    PSQL_HISTFILE: /app/log/.psql_history
    EDITOR: vi
  depends_on:
    postgres:
      condition: service_healthy
    redis:
      condition: service_healthy

...services and volumes follow...

IMPORTANT NOTE

This is the same issue, but because the OP mentioned rbenv, commenters assumed they were using rbenv to manage different Ruby versions on the host system and criticized them for not understanding Docker.

But that was not the OP's point. The OP's point was that docker-compose seemed to be ignoring the version specified in the compose file.

So let me state clearly:

  • the two projects are complaining about the Ruby version of the OTHER project, not the version on my host system
  • neither project has a hidden .ruby-version file (not needed, since I'm using the Dockerfile to specify the version)
  • I don't have any Ruby version manager installed, as I don't need one since I am using Docker
  • my host system's Ruby version is 2.6.3, so clearly it's not complaining about that

1 Answer

As usual, one minute after posting, the light bulb fires.

I realised that both projects tag their images with the same tag:

image: example-dev:1.1.0

which means whichever project builds last overwrites the shared local image, so the other project then runs against the wrong Ruby version. DOH!

So by setting a different tag in each project, I can switch between them without rebuilding or hitting the Ruby version error.
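For example (the image names below are hypothetical, not from my actual projects), each compose file can embed the project name and Ruby version in its image tag so the two builds can never overwrite each other:

```yaml
# Project A's docker-compose file (Ruby 2.7.2)
x-app: &app
  build:
    context: .dockerdev
    dockerfile: Dockerfile
    args:
      RUBY_VERSION: '2.7.2'
  # unique tag per project/version, so it never collides with Project B
  image: project-a-dev:2.7.2

# Project B's file would use, e.g.:
#   image: project-b-dev:2.5.8
```

Including the Ruby version in the tag also means an old image sticks around after an upgrade, so switching back to a branch pinned to the older Ruby doesn't force a rebuild either.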
