
I am trying to set up a docker-compose architecture for local development and production, and I can't figure out at which point in the containers' lifecycle it is best to install library dependencies. At the same time, I am not sure whether these should live inside the container or in an external volume.

All my code is mounted in external volumes, so that changes are immediately picked up without rebuilding the containers, but I am not sure about the libraries that need to be installed by pip (I am running a Python backend) and npm/yarn (for the webpack front-end).

Placing requirements.txt and package.json into the containers and running pip install and yarn install during the container build means that I have to rebuild the container every time the dependencies change - that is too much overhead.

Putting them in an external volume and running pip install and yarn install as part of each container's startup command seems to solve the issue.

The build process of each container then contains only platform dependencies (e.g. installing Python, webpack or other platform tools), while the libraries are installed after the container starts (with the CMD directive).
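To make it concrete, the backend Dockerfile I have in mind looks roughly like this (the base image, paths and the run command are just placeholders):

FROM python:3
WORKDIR /app
# the build step installs only platform tools, no project libraries
RUN pip install --upgrade pip
# the code and requirements.txt come from a mounted volume,
# so the libraries are installed when the container starts
CMD ["sh", "-c", "pip install -r requirements.txt && python app.py"]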

Is this the correct approach? I have seen a lot of examples doing exactly the opposite and running npm install in the build process of the container - but I don't see any advantage in that; am I missing something?

hoonzis
    Dependencies would normally be installed inside the container at build time, otherwise your production image will change over time. The development problem can be solved by mounting your local files into the image. [A node example](https://stackoverflow.com/a/40921548/1318694) – Matt Nov 12 '17 at 12:08
  • Thanks. The issue with this approach is that if your dependencies change often, then you have to rebuild the container. @yamenk below provided an interesting idea of listing the pip install dependencies directly in order to use layer caching. – hoonzis Nov 16 '17 at 12:15

1 Answer


Installing dependencies is usually part of the build process. Mounting the code is a good trick during development in order to get changes reflected immediately.

Concerning adding requirements.txt or package.json: installing dependencies takes time, so you want to take advantage of Docker layer caching. In particular, you want to avoid cache invalidation.

For pip I suggest the following during the development phase: for dependencies that you are unlikely to change, install them in a separate RUN instruction. Your Dockerfile will look something like:

FROM ..
# stable dependencies: this layer stays cached as long as the package list does not change
RUN pip install package1 package2 package3 ...
# volatile dependencies: only the layers below are rebuilt when requirements.txt changes
ADD requirements.txt requirements.txt
RUN pip install -r requirements.txt
...

Keep only the dependencies that are likely to change in requirements.txt. Once you are done developing, add the other packages back into requirements.txt and build using the requirements file.

A similar approach would be to add two requirements files, and combine them at the end.
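For example, the development-phase Dockerfile could look roughly like this (a sketch; the file names stable-requirements.txt and dev-requirements.txt are just illustrative):

FROM ..
# rarely changing dependencies: this layer stays cached across builds
ADD stable-requirements.txt stable-requirements.txt
RUN pip install -r stable-requirements.txt
# frequently changing dependencies: only these layers are rebuilt
ADD dev-requirements.txt dev-requirements.txt
RUN pip install -r dev-requirements.txt
...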

yamenk
  • Thanks for the idea of using Docker layer caching for pip install. The same can probably be applied to npm or any other tool. One just needs a set of "stable" dependencies and "volatile" dependencies which change often. – hoonzis Nov 16 '17 at 12:11
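For npm, an analogous layer-caching pattern would be roughly the following (a sketch along the lines of the comment above; the base image and start command are just placeholders):

FROM node:8
WORKDIR /app
# copy only the manifest first, so this layer stays cached until dependencies change
ADD package.json package.json
RUN npm install
# copying the source afterwards does not invalidate the npm install layer
ADD . .
CMD ["npm", "start"]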