I'm running the standard Airflow docker-compose file locally on my computer with all the defaults, using airflow:2.1.4, postgres:13, and redis:latest. Everything works as expected with a single scheduler instance, but as soon as I add a second scheduler instance I start seeing locking errors in the Postgres logs:
postgres_1 | STATEMENT: SELECT slot_pool.pool AS slot_pool_pool, slot_pool.slots AS slot_pool_slots FROM slot_pool FOR UPDATE NOWAIT
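For context on the error above: NOWAIT makes Postgres raise immediately if the requested rows are already locked by another transaction, instead of waiting for the lock to be released. Conceptually it behaves like a non-blocking lock acquire; here's an illustrative Python sketch (not Airflow code) of what the second scheduler runs into:

```python
import threading

# slot_pool rows locked by scheduler 1's transaction,
# modeled here as an ordinary in-process lock.
pool_lock = threading.Lock()
pool_lock.acquire()  # scheduler 1: SELECT ... FOR UPDATE succeeds

# Scheduler 2 attempts the same SELECT ... FOR UPDATE NOWAIT:
# with NOWAIT it fails fast rather than queueing behind scheduler 1.
got_it = pool_lock.acquire(blocking=False)
print(got_it)  # False -> Postgres instead raises "could not obtain lock on row"
```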
The relevant parts of my docker-compose file are:

x-airflow-common:
  &airflow-common
  environment:
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
    AIRFLOW__WEBSERVER__WEB_SERVER_MASTER_TIMEOUT: 360
    AIRFLOW__WEBSERVER__EXPOSE_CONFIG: 'true'
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
    AIRFLOW__CORE__LOAD_EXAMPLES: 'false'
    AIRFLOW__CORE__STORE_DAG_CODE: 'false'

services:
  airflow-scheduler-1:
    <<: *airflow-common
    command: scheduler
    container_name: airflow-scheduler-1

  airflow-scheduler-2:
    <<: *airflow-common
    command: scheduler
    container_name: airflow-scheduler-2
The documentation hasn't helped: it says I can simply run "airflow scheduler" multiple times and it should work out of the box. Is there some HA setting I'm missing?