
I'd like to slip a Jupyter server into my dockerized Airflow so that I can develop more easily in that environment. I see from Using Docker-Compose, how to execute multiple commands that you can easily run multiple commands in docker-compose with a 'bash -c'.

Here is what I've tried:

command: bash -c "airflow webserver ; nohup jupyter notebook --ip 0.0.0.0  --no-browser &"

I also tried:

command: bash -c "airflow webserver && nohup jupyter notebook --ip 0.0.0.0  --no-browser &"

I know this is possible because I can do something like:

docker exec -it -u airflow  8b2 jupyter notebook --ip 0.0.0.0  --NotebookApp.token='airflow' --no-browser

This of course required adding the 8888:8888 port mapping for Jupyter:

ports:
  - ${AIRFLOW_WEBSERVER_PORT:-8080}:8080
  - 8888:8888

Thanks!

2 Answers


Deploying more than one application inside a container is not a best practice. Each container should do one thing and do it well (reference).

You can do the following:

  1. Separate Airflow and Jupyter into separate containers.
  2. Use a Docker network to connect the two (see the sketch below).
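
For example, a pared-down docker-compose sketch along these lines could work; the service names, images, and token below are placeholders, and the database and other services from the quick-start compose are omitted:

services:
  airflow-webserver:
    image: apache/airflow:2.2.2
    command: webserver          # the official image accepts the airflow subcommand directly
    ports:
      - 8080:8080
    networks:
      - airflow-net

  jupyter:
    image: jupyter/base-notebook
    command: start-notebook.sh --NotebookApp.token=airflow
    ports:
      - 8888:8888
    networks:
      - airflow-net

# a shared user-defined network lets the containers reach each other by service name
networks:
  airflow-net:

With this, the Jupyter container can reach the webserver at http://airflow-webserver:8080, and each container keeps a single responsibility.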

Have a look at this project; it satisfies most of your needs and works well for me, and you might find it inspiring.

– Numb95

As @Numb95 points out, the problem is best solved by specifying a dedicated Jupyter container.

The 'quick-start' docker-compose at this time is: https://airflow.apache.org/docs/apache-airflow/stable/docker-compose.yaml.
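
To fetch it locally, something like this should work (same URL as above):

curl -LfO 'https://airflow.apache.org/docs/apache-airflow/stable/docker-compose.yaml'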

Using the quick-start docker-compose above as a guide, the code below can be placed in docker-compose.yaml, but it requires the Airflow image specified within to be a custom build that includes jupyter-core and jupyter:

  airflow-jupyter:
    <<: *airflow-common
    #image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:2.2.2}
    image: <CUSTOM_IMAGE_DESCRIBED_ABOVE>
    container_name: airflow_jupyter
    command: bash -cx "jupyter notebook --ip 0.0.0.0  --NotebookApp.token='airflow' --no-browser"
    ports:
      - 8888:8888
    restart: always
    depends_on:
      <<: *airflow-common-depends-on
      airflow-init:
        condition: service_completed_successfully

To install the Jupyter packages, the Dockerfile would need to be rebuilt to include:

RUN pip install -U jupyter-core --user
RUN pip install -U jupyter --user
RUN chmod -R 775 /home/airflow/.local/share/jupyter
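
Putting that together, a minimal Dockerfile for the custom image might look like the sketch below; the base image tag (2.2.2) and the tag used in the build command are only examples:

# Dockerfile: extend the official Airflow image with Jupyter
FROM apache/airflow:2.2.2

# the official image already runs as the airflow user, so --user installs
# land under /home/airflow/.local
RUN pip install -U jupyter-core --user
RUN pip install -U jupyter --user

# make the jupyter data directory group-writable
RUN chmod -R 775 /home/airflow/.local/share/jupyter

Build it with, for example, docker build -t airflow-jupyter:2.2.2 . and substitute that tag for <CUSTOM_IMAGE_DESCRIBED_ABOVE> in the airflow-jupyter service above.
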
  • Just to add: the docker-compose is just a "quick-start", it's not an "official way of running Airflow in Docker Compose". The nature of Docker Compose is that you should pretty much always write your own docker-compose for your own needs. It's impossible to write docker-compose as an "extendable" one. You are mostly on your own to modify it as you need, and you need to become a "docker-compose expert" to get some custom behaviours out of it. – Jarek Potiuk Dec 04 '21 at 22:54
  • Fair point regarding docker-compose being a quick start. Thanks for the correction. I've changed the word 'official' in the answer above to 'quick-start'. – Jason Anderson Dec 15 '21 at 18:15