
I am facing an issue running my Python code in a Docker container. My server has a slow internet connection, which causes the pip install -r requirements.txt command to take a long time and eventually raise a timeout error.

To overcome this issue, I decided to copy my local Python environment into the Docker container and run the code using that environment. However, I encountered a ModuleNotFoundError indicating that the required module (in this case, fastapi) is not found within the container.

I have created a Dockerfile with the following content:

FROM python:3.10-slim
WORKDIR /app/
# copy the whole project, including the local virtual env (envcache)
COPY . /app
# put the copied env's bin directory first on PATH
ENV PATH="/app/envcache/bin:$PATH"
CMD ["python", "app.py"]
Mostafa Najmi

2 Answers


From your description it sounds like you have FastAPI installed in a virtual env, and you're not activating it. Even if you did, it may or may not work, because some dependencies ship native code: you presumably installed them on your machine before copying, and your machine may not have the same libc, architecture, or even OS as the image.
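For what it's worth, the usual way to "activate" a copied env inside an image is to invoke its interpreter directly rather than sourcing an activate script; a minimal sketch, assuming the env really does live at /app/envcache:

FROM python:3.10-slim
WORKDIR /app/
COPY . /app
# run the copied env's interpreter directly; no `source activate` needed
CMD ["/app/envcache/bin/python", "app.py"]

Even then, this only works if the env was built on a matching platform, which is exactly the caveat above.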

When building a Docker image, build layers are cached as long as nothing above them has changed, so if you use poetry, the hashes in its lock file would likely keep that layer (and the install step) from being re-run.
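A rough sketch of that approach, assuming a standard pyproject.toml/poetry.lock pair (adjust the poetry options to your setup):

FROM python:3.10-slim
WORKDIR /app/
RUN pip install poetry
# copy only the dependency manifests so this layer stays cached between builds
COPY pyproject.toml poetry.lock /app/
# install into the image's interpreter instead of a poetry-managed venv
RUN poetry config virtualenvs.create false && poetry install --no-root
# source changes no longer invalidate the install layer above
COPY . /app
CMD ["python", "app.py"]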

You could also save time by using one of the FastAPI base images, though depending on what other dependencies you have this might not be practical.
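For example, the FastAPI docs reference community base images such as tiangolo/uvicorn-gunicorn-fastapi; the exact image name, tag, and default module layout are assumptions here, so check the image's documentation before relying on it:

# image name/tag assumed; verify against the image's documentation
FROM tiangolo/uvicorn-gunicorn-fastapi:python3.10
COPY ./requirements.txt /app/requirements.txt
RUN pip install -r /app/requirements.txt
# the base image starts the server itself and, by default, looks for an
# `app` object in /app/main.py (or /app/app/main.py)
COPY . /app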

Zac Anger
  • Thanks. I tested the `asdkant/fastapi-hello-world` docker image and activated the env using `CMD ["/bin/bash", "-c", "source env/bin/activate"]`, but I still get a `ModuleNotFoundError` for the `requests` module. (There are lots of packages; I've reduced it to a simpler project to work on this issue.) – Mostafa Najmi Jul 31 '23 at 07:24
  • What about using poetry instead of pip? Or a pipfile.lock? This would ensure that the dependency installation layer stays cached because it wouldn't change between builds. – Zac Anger Jul 31 '23 at 19:02

Copying the virtual env you have created in your local filesystem doesn't guarantee that it will work in your Docker container. As @Zac Anger suggested, your machine and your Docker image may not share the same Python configuration, OS, etc. This also defeats the purpose of using Docker images to contain everything needed to run an application in an isolated filesystem.

Your best bet may be to keep using pip, but increase the timeout to a value that gives your server enough time to download all the required packages (e.g. 100 seconds; the default timeout is 15 seconds). You can set this timeout, in seconds, by either:

  • Setting the PIP_DEFAULT_TIMEOUT environment variable
  • Passing the --timeout option to pip install (see the snippet below)
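For instance, the flag form could look like this in a Dockerfile (the 100-second value is just an example):

RUN pip install --timeout 100 -r /app/requirements.txt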

Also, to speed up your Docker image builds, you can rely on Docker's layer cache to avoid re-running the dependency installation on every build, as long as your requirements.txt hasn't changed.

Assuming your requirements.txt file is located in your build context root, copy it on its own first so that the install step is cached and only re-runs when the file changes. Your Dockerfile can now look like this:

FROM python:3.10-slim

# Set pip default timeout to 100 secs, also disable version check
ENV PIP_DEFAULT_TIMEOUT=100 \
    PIP_DISABLE_PIP_VERSION_CHECK=1

WORKDIR /app/

# copy project requirement files to ensure they will be cached
COPY ./requirements.txt /app/requirements.txt

# install packages with pip
RUN pip install -r /app/requirements.txt

# now copy your app's source code
COPY . /app

CMD ["python", "app.py"]

One final note: since you keep virtual envs in your local filesystem, consider adding a .dockerignore so they are not copied into your image. Assuming your virtual env is named envcache, create a .dockerignore file in your build context root with this line:

envcache
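You can extend the same file with other entries that don't belong in the image; a typical (assumed) set might be:

envcache
__pycache__/
*.pyc
.git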
amoralesc