Consider a project that contains two images that interact. Good practice seems to be to give each image its own directory containing its Dockerfile, with a docker-compose.yml at the top level:
project_package/
├── docker-compose.yml
├── image1/
│   ├── __init__.py
│   ├── Dockerfile1
│   ├── image1_1.py
│   └── image1_2.py
│
└── image2/
    ├── __init__.py
    ├── Dockerfile2
    ├── image2_1.py
    └── image2_2.py
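For reference, the top-level docker-compose.yml for this layout would look something like the following (a minimal sketch; the service names are illustrative):

# docker-compose.yml (minimal sketch; service names are illustrative)
version: "3.8"
services:
  image1:
    build:
      context: image1/
      dockerfile: Dockerfile1   # non-default filename, so it must be named explicitly
  image2:
    build:
      context: image2/
      dockerfile: Dockerfile2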
Now suppose that there is some common code that both images depend upon (in Python parlance: a module that they both import from). I see a few approaches here:
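Concretely, code in both images would contain something like this (illustrative names; assumes common/ ends up on the import path inside each image):

# image1/image1_1.py (illustrative)
from common import common1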
- Duplicate that code into the directories image1 and image2, and build directly. Undesirable, because then I need to update in two places whenever I change common/.
project_package/
├── docker-compose.yml
├── image1/
│   ├── common/
│   │   ├── common1.py
│   │   └── common2.py
│   ├── __init__.py
│   ├── Dockerfile1
│   ...
│
└── image2/
    ├── common/
    │   ├── common1.py
    │   └── common2.py
    ├── __init__.py
    ├── Dockerfile2
    ...
- Have three top-level directories - image1, image2, and common. Have Dockerfile1 and Dockerfile2 copy files from the common directory into the image. Doesn't work as imagined: the docs say that "The path must be inside the context of the build", though there might be a way to make this work? (See the compose sketch after the Dockerfile below.)
project_package/
├── docker-compose.yml
├── image1/
│   ├── __init__.py
│   ├── Dockerfile1
│   ...
│
├── image2/
│   ├── __init__.py
│   ├── Dockerfile2
│   ...
│
└── common/
    ├── common1.py
    └── common2.py
# image1/Dockerfile1
FROM python:3.8-slim-buster
RUN mkdir common
COPY ../common/ common    # fails: ../common is outside the build context (image1/)
...
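There does seem to be a way to make this layout work: widen the build context to the project root in docker-compose.yml and point each service at its Dockerfile explicitly (an untested sketch on my part; paths in the Dockerfiles are then resolved from the root):

# docker-compose.yml (workaround sketch; context widened to the project root)
services:
  image1:
    build:
      context: .                      # common/ is now inside the build context
      dockerfile: image1/Dockerfile1  # path is relative to the context

With this, Dockerfile1 would use COPY common/ common/ and COPY image1/ . rather than the failing ../ path.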
- Three directories as above, but place the Dockerfiles in the root directory rather than in the per-image directories, so that the COPY instructions can "see" the common/ directory. Contravenes the apparent best practice of colocating each Dockerfile with the code it builds, but perhaps good otherwise? (A sketch of a root-level Dockerfile follows the tree below.)
project_package/
├── docker-compose.yml
├── Dockerfile1
├── Dockerfile2
├── image1/
│   ├── __init__.py
│   ├── image1_1.py
│   ...
│
├── image2/
│   ├── __init__.py
│   ├── image2_1.py
│   ...
│
└── common/
    ├── common1.py
    └── common2.py
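A root-level Dockerfile then sees both directories in its context (a sketch; the WORKDIR and entrypoint are my assumptions):

# Dockerfile1 (at the project root)
FROM python:3.8-slim-buster
WORKDIR /app
COPY common/ common/              # visible because the context is the project root
COPY image1/ .
CMD ["python", "image1_1.py"]     # assumed entrypoint

docker-compose.yml would pair this with context: . and dockerfile: Dockerfile1 for the image1 service.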
- Use a multi-stage build, and reference the built common image in the main image Dockerfiles:
<Same layout as in 2>
# .github/workflows/main.yml
...
steps:
  - name: Build and push Common
    id: docker_build_common
    uses: docker/build-push-action@v2
    with:
      context: common/
      file: common/Dockerfile
      push: true
      tags: ${{ secrets.DOCKER_HUB_USERNAME }}/common-package:latest
...
# image1/Dockerfile
FROM python:3.8-slim-buster
RUN mkdir common
# --from can reference any image (here, the pushed common image), not just an earlier build stage
COPY --from=<username>/common-package:latest *.py common/
...
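For completeness, the common/Dockerfile that the workflow builds could be as small as this (a sketch; the image only needs to hold the shared files):

# common/Dockerfile (sketch; referenced by the workflow above)
FROM scratch
COPY common1.py common2.py /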
- (Overkill option, at least for this level of project) Export the common code as a standalone published module and depend on it via standard code-dependency mechanisms (e.g. requirements.txt/pip for Python). A sketch follows.
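Each image would then install the shared code like any other dependency (a sketch; the package name my-common is hypothetical):

# image1/Dockerfile1 (sketch; "my-common" is a hypothetical published package)
FROM python:3.8-slim-buster
COPY requirements.txt .
RUN pip install -r requirements.txt   # requirements.txt would list e.g. my-common==1.0.0
COPY . .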
What would be your preferred method? Is there an approach that I'm missing?