
When running pip within Docker, I've learned from many excellent SO answers how to cache the downloading of packages, for example:

```dockerfile
# syntax=docker/dockerfile:experimental
...

RUN --mount=type=cache,mode=0755,id=pip,target=/root/.cache/pip \
    pip3 install -r requirements.txt --target /root/.cache/pip
```

This caches the downloads of the packages listed in requirements.txt, but not the installation itself. The problem is that if I add a new package to requirements.txt - assuming no shared dependencies change - I only save on download time, which in this case is only about half the Docker build time.

Now I could just add new packages one-by-one as extra lines in the Dockerfile, so that Docker's layer cache preserves the earlier installs, but then I have to circle back round for the inevitable squash into a single RUN line - and that time hit is unavoidable if one wants to keep the Dockerfile simple and minimize the image size.
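For clarity, the one-by-one workaround I mean looks roughly like this (the package name is hypothetical, and this is just a sketch of the pattern, not my actual Dockerfile); each extra RUN line installs only the new package, so Docker's layer cache keeps the earlier installs, but the lines eventually need squashing back into one:

```dockerfile
# syntax=docker/dockerfile:experimental
FROM python:3.11-slim
COPY requirements.txt .

# Bulk install of the original requirements, with the pip
# download cache mounted so wheels aren't re-fetched
RUN --mount=type=cache,mode=0755,id=pip,target=/root/.cache/pip \
    pip3 install -r requirements.txt

# Workaround: each newly added package gets its own RUN layer,
# leaving the layer above untouched in the build cache
# (package name here is hypothetical)
RUN --mount=type=cache,mode=0755,id=pip,target=/root/.cache/pip \
    pip3 install some-new-package
```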

So, is there a way to speed up the build even further by caching more than just the package downloads?

Thanks as ever!

REFS:

Using a pip cache directory in docker builds

How to avoid reinstalling packages when building Docker image for Python projects?

How to cache downloaded PIP packages

jtlz2
  • May I recommend Itamar Turner-Trauring's excellent [set of articles on Docker+Python](https://pythonspeed.com/docker), especially [Fast builds, small images](https://pythonspeed.com/docker#fast-builds-small-images) section? – phd May 16 '23 at 10:19
