3

I am new to the Docker world. I am trying to understand Docker concepts about parent images. Assume that I want to run my Django application on Docker. I want to use Ubuntu and Python, I want to have PostgreSQL as my database backend, and I want to run my Django application on the Gunicorn web server. Can I have a different base image for Ubuntu, Python, Postgres and Gunicorn and create my Django container like this:

FROM ubuntu
FROM python:3.6.3
FROM postgres
FROM gunicorn
...

I am thinking about having different base images because, if someday I want to update one of them, I only have to update the base image rather than going into Ubuntu and updating everything myself.

Saber Solooki
  • 1,182
  • 1
  • 15
  • 34

4 Answers

3

You can use multiple FROM lines in the same Dockerfile, provided you are doing a multi-stage build.

One part of the Dockerfile would build an intermediate image used by another.

But that is generally used to cleanly separate the parents used for building your final program from the parents needed to execute your final program.
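
For instance (only a sketch of the general pattern; the gcc image, hello.c and alpine tag are arbitrary assumptions, not taken from the question), the build stage can use a heavy parent that carries the compiler, while the final stage uses a small parent that only has to run the result:

    # build stage: parent image ships the compiler and build tools
    FROM gcc:9 AS builder
    WORKDIR /src
    COPY hello.c .
    RUN gcc -static -o hello hello.c

    # final stage: minimal parent that only executes the compiled binary
    FROM alpine:3.10
    COPY --from=builder /src/hello /usr/local/bin/hello
    CMD ["hello"]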

VonC
  • 1,262,500
  • 529
  • 4,410
  • 5,250
2

No, you cannot create your image like this; in the Dockerfile you posted, only the last FROM gunicorn would effectively be treated as the base image. What you need is multi-stage builds, but before that I will clarify some concepts about such a Dockerfile.

A parent image is the image that your image is based on. It refers to the contents of the FROM directive in the Dockerfile. Each subsequent declaration in the Dockerfile modifies this parent image. Most Dockerfiles start from a parent image, rather than a base image. However, the terms are sometimes used interchangeably.
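
To make the distinction concrete (a minimal sketch; the hello binary is a hypothetical statically linked executable), an image built FROM scratch has no parent at all, whereas the FROM ubuntu line in your Dockerfile makes ubuntu the parent image:

    # a true "base" image: built FROM scratch, it has no parent image
    FROM scratch
    COPY hello /
    CMD ["/hello"]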

But in your case, I would not recommend putting everything in one Dockerfile. It defeats the purpose of containerization.

Rule of Thumb

One process per container

Each container should have only one concern

Decoupling applications into multiple containers makes it much easier to scale horizontally and reuse containers. For instance, a web application stack might consist of three separate containers, each with its own unique image, to manage the web application, database, and an in-memory cache in a decoupled manner.

dockerfile_best-practices

Apart from the database, you can use multi-stage builds for the application image itself.

If you use Docker 17.05 or higher, you can use multi-stage builds to drastically reduce the size of your final image, without the need to jump through hoops to reduce the number of intermediate layers or remove intermediate files during the build.

With the image being built by the final stage only, you can most of the time benefit from the build cache and minimize image layers.

Your build stage may contain several layers, ordered from the less frequently changed to the more frequently changed, for example:

  • Install tools you need to build your application

  • Install or update library dependencies

  • Generate your application

use-multi-stage-builds


With a multi-stage build, the Dockerfile can contain multiple FROM lines; each stage starts with a new FROM line and a fresh context. You can copy artifacts from stage to stage, and the artifacts not copied over are discarded. This allows you to keep the final image smaller and only include the relevant artifacts.
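
As a rough sketch of that idea for a Python project (the requirements.txt file, the /venv path, the slim tag and the gunicorn command line are assumptions for illustration, not something prescribed by the question), the dependencies are installed in a build stage and only the resulting virtualenv plus the project code end up in the final image:

    # build stage: install dependencies into a virtualenv
    # (gunicorn is assumed to be listed in requirements.txt)
    FROM python:3.6 AS builder
    COPY requirements.txt .
    RUN python -m venv /venv && /venv/bin/pip install -r requirements.txt

    # final stage: copy only the virtualenv and the project code
    FROM python:3.6-slim
    COPY --from=builder /venv /venv
    WORKDIR /app
    COPY . .
    CMD ["/venv/bin/gunicorn", "docker_django.wsgi:application", "-b", ":8000"]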

Adiii
  • 54,482
  • 7
  • 145
  • 148
  • Thanks for your answer, but I still can't understand the usage of multiple FROM lines. If I should use a different container for each service and connect them with a compose file (if I'm right that this is the correct way), why would I need another FROM? – Saber Solooki Sep 03 '19 at 06:15
  • Yes, that is the right way; in your case there is nothing that needs to be multi-stage. The multi-stage build is helpful when you want `some artifacts` from the base image, not all the artifacts, to make the final image smaller. You just copy some layers, not the whole content of the base image. You only need the Python runtime, so why multi-stage, when Python officially maintains its image? `— the possibilities are endless. But multi-stage builds don’t always make sense.` You can read further here: https://medium.com/capital-one-tech/multi-stage-builds-and-dockerfile-b5866d9e2f84 – Adiii Sep 03 '19 at 06:26
  • Thank you. If I understand correctly, we use multi-stage when each stage does different things. For example, in a Java application one stage builds and compiles the application, and in the second stage we copy the compiled file to the final image. Is that right? – Saber Solooki Sep 03 '19 at 06:41
  • Yes, in this case it makes sense: if compilation fails, no final image will be produced. In such cases multi-stage makes sense. – Adiii Sep 03 '19 at 06:43
  • In a Django application we use the Python parent and, for example, use pip to get the requirements, and in the second stage we copy the whole project into the final image. Am I right? – Saber Solooki Sep 03 '19 at 06:44
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/198862/discussion-between-adiii-and-saber-solooki). – Adiii Sep 03 '19 at 06:44
1

Is it possible? Yes, technically multiple base images (FROM XXXX) can appear in a single Dockerfile, but that is not for what you are trying to do. They are used for multi-stage builds. You can read more about it here.

The answer to your question is that, if you want to achieve this type of Docker image, you should use one base image and install everything else on top of it with RUN commands, like this:

FROM ubuntu

# refresh the package index and pass -y so the build does not stop at a prompt
RUN apt-get update && apt-get install -y postgresql

...

Obviously it is not that simple. The base ubuntu image is very minimal; you have to install all the dependencies and tools needed to install Python, PostgreSQL and Gunicorn yourself with RUN commands. For example, if you need to download the Python source code using

RUN wget https://www.python.org/ftp/python/3.7.4/Python-3.7.4.tgz 

wget is (most probably) not preinstalled in the ubuntu image. You have to install it yourself.
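
For example (just a sketch of the extra step; the exact packages may vary), you would first need something like:

    RUN apt-get update && apt-get install -y wget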

Should I do it? I think you are going against the whole idea of dockerizing apps, which is not to build a monolithic giant image containing all the services, but to divide the services into separate containers (generally there should be one service per container) and then make these containers talk to each other with Docker networking tools. That is, you should use one container for Postgres, one for nginx and one for Gunicorn, run them separately and connect them via a network. There is an awesome tool, docker-compose, that comes with Docker to automate this kind of multi-container setup. You should really use it. For a more practical example, please read this good article.

Nafees Anwar
  • 6,324
  • 2
  • 23
  • 42
1

You can use the official Docker image for Django: https://hub.docker.com/_/django/ . It is well documented and its Dockerfile is explained.

If you want to use different base images then you should go with docker-compose. Your docker-compose.yml will look like this:

    version: '3'

    services:
      web:
        restart: always
        build: ./web
        expose:
          - "8000"
        links:
          - postgres:postgres
          - redis:redis
        volumes:
          - web-django:/usr/src/app
          - web-static:/usr/src/app/static
        env_file: .env
        environment:
          DEBUG: 'true'
        command: /usr/local/bin/gunicorn docker_django.wsgi:application -w 2 -b :8000

      nginx:
        restart: always
        build: ./nginx/
        ports:
          - "80:80"
        volumes:
          - web-static:/www/static
        links:
          - web:web

      postgres:
        restart: always
        image: postgres:latest
        ports:
          - "5432:5432"
        volumes:
          - pgdata:/var/lib/postgresql/data/

      redis:
        restart: always
        image: redis:latest
        ports:
          - "6379:6379"
        volumes:
          - redisdata:/data

    volumes:
      web-django:
      web-static:
      pgdata:
      redisdata:

Follow this blog for details: https://realpython.com/django-development-with-docker-compose-and-machine/

Saleem Ali
  • 1,363
  • 11
  • 21