
I am trying to run Airflow on Docker (on macOS 12.6.2 with 8 GB of memory). I have downloaded the official docker-compose.yaml file for apache/airflow:2.4.2 and set my .env file to this:

AIRFLOW_IMAGE_NAME=apache/airflow:2.4.2
AIRFLOW_UID=50000
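To confirm those values are actually being substituted into the compose file, I render the resolved configuration first (this just applies the .env file without starting anything):

docker-compose config | grep 'apache/airflow'
# expected output, once per service that uses the image:
#   image: apache/airflow:2.4.2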

When I run docker-compose up -d and wait, the webserver container never becomes healthy:

[screenshot: container status output showing the webserver container stuck as unhealthy]
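For what it's worth, here is how I am checking the status (the container name below is a placeholder; mine is whatever docker-compose generated):

docker ps --format 'table {{.Names}}\t{{.Status}}'
# drill into the failing health check of a single container:
docker inspect --format '{{json .State.Health}}' <webserver-container-name>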

As numerous people have suggested for macOS, I have increased the memory available to Docker:

[screenshot: Docker Desktop resource settings with the memory allocation increased]

I have tried numerous Docker memory allocations (8, 7, 6, 5, and 4 GB), as well as different combinations of CPUs, swap, and virtual disk limit (I have not gone higher than 160 GB on the virtual disk limit). I have also read that giving Docker all 4 CPUs is a bad idea, so I have not tried that.
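To verify those settings are actually taking effect, I check what the Docker VM and the containers really see (both are standard Docker CLI commands):

docker info --format '{{.MemTotal}}'   # total bytes of memory visible to the Docker VM
docker stats --no-stream               # one-shot per-container CPU and memory usage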

Here is the log I get for the webserver container:

2023-01-12 03:26:30 [2023-01-12 11:26:30 +0000] [79] [CRITICAL] WORKER TIMEOUT (pid:215)
2023-01-12 03:26:31 [2023-01-12 11:26:31 +0000] [79] [CRITICAL] WORKER TIMEOUT (pid:216)
2023-01-12 03:26:33 [2023-01-12 11:26:32 +0000] [79] [CRITICAL] WORKER TIMEOUT (pid:217)
2023-01-12 03:26:33 [2023-01-12 11:26:33 +0000] [79] [WARNING] Worker with pid 215 was terminated due to signal 9
2023-01-12 03:26:34 [2023-01-12 11:26:34 +0000] [262] [INFO] Booting worker with pid: 262
2023-01-12 03:26:36 [2023-01-12 11:26:36 +0000] [79] [WARNING] Worker with pid 217 was terminated due to signal 9
2023-01-12 03:26:36 [2023-01-12 11:26:36 +0000] [79] [CRITICAL] WORKER TIMEOUT (pid:219)
2023-01-12 03:26:36 [2023-01-12 11:26:36 +0000] [263] [INFO] Booting worker with pid: 263
2023-01-12 03:26:37 [2023-01-12 11:26:37 +0000] [79] [WARNING] Worker with pid 216 was terminated due to signal 9
2023-01-12 03:26:38 [2023-01-12 11:26:38 +0000] [265] [INFO] Booting worker with pid: 265
2023-01-12 03:26:39 [2023-01-12 11:26:39 +0000] [79] [WARNING] Worker with pid 219 was terminated due to signal 9
2023-01-12 03:26:40 [2023-01-12 11:26:40 +0000] [266] [INFO] Booting worker with pid: 266
2023-01-12 03:28:34 [2023-01-12 11:28:33 +0000] [79] [CRITICAL] WORKER TIMEOUT (pid:262)
2023-01-12 03:28:36 [2023-01-12 11:28:36 +0000] [79] [CRITICAL] WORKER TIMEOUT (pid:263)
2023-01-12 03:28:38 [2023-01-12 11:28:38 +0000] [79] [CRITICAL] WORKER TIMEOUT (pid:265)
2023-01-12 03:28:39 [2023-01-12 11:28:39 +0000] [79] [CRITICAL] WORKER TIMEOUT (pid:266)

...And the "worker timeout-booting worker-worker timeout" cycle continues forever. Now, if I comment out (remove) the redis, airflow-workflow, and airflow-triggerrer parts of the compose file as suggested by this article under the "Airflow Installation -- Lite Version" section of the article, everything runs fine and everything is healthy. But, I know that I'm going to need those containers in the future.

If I've maxed out my macOS resources, what do you suggest I do?

(NOTE: [This question on Stack Overflow](https://stackoverflow.com/questions/67637004/gunicorn-worker-terminated-with-signal-9/67719886#67719886) suggests increasing Docker memory as the solution. However, as you can see from my screenshot and text above, I have already tried that and it did not work.)

  • Can you run `docker logs `? Also, what are you trying to accomplish by running Airflow locally? – RNHTTR Jan 12 '23 at 16:29
  • Hello @RNHTTR, the logs are already there (the last code cell of the post) from running `docker logs `. I would like to run Airflow locally to learn how to use it before moving it to the cloud and potentially making an expensive mistake. – Austin Wolff Jan 12 '23 at 20:40
  • The [docs suggest 8 GB should be fine](https://airflow.apache.org/docs/apache-airflow/stable/howto/docker-compose/index.html), so that's odd. If your goal is to just learn about Airflow, you might want to consider using the [astro cli](https://docs.astronomer.io/astro/cli/overview) in order to create an [astro project](https://docs.astronomer.io/astro/create-project) & get started with `astro dev start`. I have Docker configured to use 3 CPUs, 6GB memory, 1GB swap, 60 GB virtual disk and it runs smoothly. Free & open source & super easy to get started. Disclaimer: I work for Astronomer. – RNHTTR Jan 12 '23 at 21:13
  • For anyone wondering, this problem was never solved. I had to use a lightweight version of Airflow with less functionality. I believe my macOS machine does not have enough resources to run the full Airflow docker-compose.yaml setup. – Austin Wolff Feb 22 '23 at 16:54
