This may be a duplicate of "How to include files outside of Docker's build context?", but...
I am looking to have a shared python codebase available as a local dependency across multiple apps (docker images/containers).
The directory structure looks like:
my-projects/
|-projectlib/
| |-__init__.py
| |-utils.py
|
|-my-app-1/
| |-app.py
| |-Dockerfile
| |-docker-compose.yaml
|
|-my-app-2/
| |-app.py
| |-Dockerfile
| |-docker-compose.yaml
I'd like both `app.py`s to have `projectlib` available to them inside the Docker image. That is, when I call `docker compose build` inside `my-app-1`, I want it to build as though the build context is:
|-my-app-1/
| |-app.py
| |-Dockerfile
| |-docker-compose.yaml
| |-projectlib/
| | |-__init__.py
| | |-utils.py
This is because within `app.py` I have imports like `from projectlib.utils import connect_to_db`, etc.
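For what it's worth, here's the sort of compose setup I'm imagining, as a minimal sketch; it relies on the build `context` being allowed to point at the parent directory, with the `dockerfile` path given relative to that context:

```yaml
# my-app-1/docker-compose.yaml (sketch)
services:
  my-app-1:
    build:
      context: ..                      # use my-projects/ as the build context
      dockerfile: my-app-1/Dockerfile  # resolved relative to the context
```

The Dockerfile would then copy by context-relative paths, e.g. `COPY projectlib/ /app/projectlib/` and `COPY my-app-1/app.py /app/` (the `/app` target is just my placeholder).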
Why? For a few reasons:
- I make changes to `projectlib` multiple times per day. I don't want to keep one copy in each `my-app-*` folder for fear that they may get out of sync.
- Ideally the `projectlib/` folder would be mapped/mounted into the containers, so that when I edit the `utils.py` file it gets "hot-reloaded" by the `app.py` files that are currently running (these are Streamlit & FastAPI apps); see the volume sketch after this list.
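As a minimal sketch of that mount (the `/app/projectlib` target is my assumption about where the code lives in the image):

```yaml
# my-app-1/docker-compose.yaml (sketch)
services:
  my-app-1:
    volumes:
      - ../projectlib:/app/projectlib  # host edits appear live inside the container
```

Relative host paths here are resolved against the directory containing the compose file, so `../projectlib` points back at the shared folder.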
Note that this is all for my dev process. I'm not insane: however funny the idea of all the apps simultaneously crashing when I save `utils.py` with a syntax error, the deployment process is separate.
Some alternatives:
- I could copy `projectlib/` into the build context via a bash script that then calls `docker compose build` (sketched below).
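Something like this rough sketch (the script name and cleanup behaviour are my own choices):

```bash
#!/usr/bin/env bash
# my-app-1/build.sh (sketch): copy the shared lib into the build
# context, build the image, then remove the temporary copy.
set -euo pipefail

rm -rf ./projectlib
cp -r ../projectlib ./projectlib
docker compose build
rm -rf ./projectlib  # don't leave a stale copy around to drift out of sync
```

But that feels like a workaround, hence the question.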
Is this something that can be done?