9

I have the BigQuery connectors all running, but I have some existing scripts in Docker containers that I would like to schedule on Cloud Composer instead of App Engine Flexible.

I have the script below, which seems to follow the examples I can find:

import datetime
from airflow import DAG
from airflow import models
from airflow.operators.docker_operator import DockerOperator

yesterday = datetime.datetime.combine(
    datetime.datetime.today() - datetime.timedelta(1),
    datetime.datetime.min.time())

default_args = {
    # Setting start date as yesterday starts the DAG immediately
    'start_date': yesterday,
    # If a task fails, retry it once after waiting at least 5 minutes
    'retries': 1,
    'retry_delay': datetime.timedelta(minutes=5),
}

schedule_interval = '45 09 * * *'

dag = DAG('xxx-merge', default_args=default_args, schedule_interval=schedule_interval)

hfan = DockerOperator(
    task_id='hfan',
    image='gcr.io/yyyyy/xxxx',
    dag=dag,  # attach the task to the DAG defined above
)

...but when I try to run it, the web UI tells me:

Broken DAG: [/home/airflow/gcs/dags/xxxx.py] No module named docker

Is it perhaps that Docker is not configured to work inside the Kubernetes cluster that Cloud Composer runs on? Or am I just missing something in the syntax?

MarkeD
  • 2,500
  • 2
  • 21
  • 35
  • Does this answer your question? [Running docker operator from Google Cloud Composer](https://stackoverflow.com/questions/51185485/running-docker-operator-from-google-cloud-composer) – ricoms Feb 13 '20 at 17:05
  • 1
    It’s been a couple of years since I asked this question :) These days I use KubernetesPodOperator instead; installing docker or any other extra configuration on Airflow didn’t work out well – MarkeD Feb 14 '20 at 19:46

5 Answers

8

I got it resolved by installing docker-py==1.10.6 in the PyPI section of Composer.

However, getting DockerOperator to work properly requires a bit more effort, as the Composer workers do not have access to the Docker daemon. Head to the GCP Console and perform the following steps (after getting cluster credentials):

  1. Export current deployment config to file

    kubectl get deployment airflow-worker -o yaml --export > airflow-worker-config.yaml

  2. Edit airflow-worker-config.yaml to mount docker.sock and the docker binary, and grant privileged access to airflow-worker so it can run docker commands (see the sketch after these steps)

  3. Apply deployment settings

    kubectl apply -f airflow-worker-config.yaml
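
For reference, here is a minimal sketch of the step 2 edits, assuming a standard Deployment spec; the volume names are illustrative, and your exported file will contain many more fields that should be left as-is:

    # Relevant fragments of airflow-worker-config.yaml: mount the host's
    # Docker socket and binary into the worker container, and run the
    # container privileged so it can talk to the host's Docker daemon.
    spec:
      template:
        spec:
          containers:
          - name: airflow-worker
            securityContext:
              privileged: true          # required to use the host's daemon
            volumeMounts:
            - name: docker-sock
              mountPath: /var/run/docker.sock
            - name: docker-bin
              mountPath: /usr/bin/docker
          volumes:
          - name: docker-sock
            hostPath:
              path: /var/run/docker.sock
          - name: docker-bin
            hostPath:
              path: /usr/bin/docker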

mchan
  • 81
  • 4
  • 1
    To new readers, this is not a recommended reconfiguration: https://stackoverflow.com/a/63428706/1380918. There is also no guarantee that patches to `airflow-worker` will persist if you update or upgrade your environment. Consider `GKEPodOperator` as the recommended solution for launching containers. – hexacyanide Aug 15 '20 at 17:07
5

This means: wherever your Airflow instance is installed, the Python package named docker is missing.

If this were my personal machine, I could install the missing package with

pip install docker

EDIT

Within the source code of the Docker operator (https://airflow.incubator.apache.org/_modules/airflow/operators/docker_operator.html) there is an import statement:

from docker import Client, tls

So the new error `cannot import name Client` seems to me to be connected to a broken install or a wrong version of the docker package.
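
For context, the low-level client was renamed between major versions of the package, which is why the import in older Airflow releases breaks. A small sketch (assuming some version of the docker package is installed) showing the difference:

    # docker < 2.0.0 exported the low-level client as `Client`;
    # docker >= 2.0.0 renamed it to `APIClient`, so Airflow versions
    # that do `from docker import Client, tls` fail against newer packages.
    try:
        from docker import Client                  # docker-py / docker 1.x
    except ImportError:
        from docker import APIClient as Client     # docker >= 2.0.0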

tobi6
  • 8,033
  • 6
  • 26
  • 41
  • Ahh ok, so I should install that in the PyPi section! Will try it now, thanks! – MarkeD May 09 '18 at 12:15
  • Hmm well close but now after installing docker I get `Broken DAG: [...] cannot import name Client` - is this the same thing or a different issue? – MarkeD May 09 '18 at 13:10
  • Great thank you - I will try it with a new docker version and environment – MarkeD May 09 '18 at 13:31
  • Is it possible it should be installing `pip install docker-py`? (no, it's from 2016 https://pypi.org/project/docker-py/) – MarkeD May 09 '18 at 13:37
  • Have you seen this: https://stackoverflow.com/questions/43386003/airflow-inside-docker-running-a-docker-container – tobi6 May 09 '18 at 13:43
  • I have when I was installing airflow on my own server, but this is via Google Cloud Composer so I don't have server access - I do suspect that they may have to configure that though – MarkeD May 09 '18 at 13:52
  • `Client` was renamed to `DockerClient` in docker>=2.0.0 so I'm trying to install `docker==1.10.6` – MarkeD May 09 '18 at 14:53
  • This is still not working - the docker install fails when I try via its pip section, I'm waiting for some answer from Google on if it is supported – MarkeD May 16 '18 at 14:52
  • All Python packages need to be installed into the virtual env which Airflow is using (even docker), not system-wide/globally, for Airflow to be able to access them – imsheth Feb 28 '21 at 13:13
1

As explained in other answers, the Docker Python client is not preinstalled in Cloud Composer environments. To install it, add it as a PyPI dependency in your environment's configuration.

Caveat: by default, DockerOperator will try to talk to the Docker API at /var/run/docker.sock to manage containers. This socket is not mounted inside Composer Airflow worker pods, and manually configuring it to do so is not recommended. Use of DockerOperator is only recommended in Composer if configured to talk to Docker daemons running outside of your environments.

To avoid more brittle configuration or surprises from bypassing Kubernetes (since it is responsible for managing containers across the entire cluster), you should use the KubernetesPodOperator. If you are launching containers into a GKE cluster (or the Composer environment's cluster), then you can use GKEPodOperator, which has more specific GCP-related parameters.
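
As an illustration, here is a minimal sketch of the question's task rewritten with KubernetesPodOperator (the import path matches the Airflow 1.x releases bundled with Composer; the name, namespace, and start date are placeholder values):

    import datetime
    from airflow import DAG
    from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator

    dag = DAG(
        'xxx-merge',
        start_date=datetime.datetime(2018, 5, 1),  # placeholder
        schedule_interval='45 09 * * *',
    )

    # Runs the image as a pod in the environment's GKE cluster, so no
    # Docker daemon access is needed on the Airflow workers.
    hfan = KubernetesPodOperator(
        task_id='hfan',
        name='hfan',              # pod name (placeholder)
        namespace='default',      # target namespace (placeholder)
        image='gcr.io/yyyyy/xxxx',
        dag=dag,
    )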

hexacyanide
  • 88,222
  • 31
  • 159
  • 162
1

What solved the problem in my case was adding "docker" to the list of apache-airflow extras installed in the Dockerfile:

&& pip install pyasn1 \
&& pip install apache-airflow[crypto,docker,celery,postgres,hive,jdbc,mysql,ssh${AIRFLOW_DEPS:+,}${AIRFLOW_DEPS}]==${AIRFLOW_VERSION} \
&& pip install 'redis==3.2' \
maleckicoa
  • 481
  • 5
  • 8
0

As noted in tobi6's answer, you need to have the PyPI package for docker installed in your Composer environment. The Cloud Composer documentation has instructions for installing PyPI packages in your environment at a particular package version.
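
For example, the package can also be added from the command line; ENVIRONMENT_NAME, LOCATION, and the pinned version below are placeholders for your own values:

    # Add or pin the docker package in an existing Composer environment.
    gcloud composer environments update ENVIRONMENT_NAME \
        --location LOCATION \
        --update-pypi-package docker==2.7.0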

Wilson
  • 614
  • 4
  • 9