
I am running Docker Desktop 3.3.1 with Linux containers on Windows Server 2019.

Airflow started on Docker Desktop using the docker-compose file as mentioned here, and it is running successfully. But some time after starting a DAG, I get the following message in the Airflow webserver:

“The scheduler does not appear to be running. Last heartbeat was received X minute ago. The DAGs list may not update, and new tasks will not be scheduled.”

Sometimes the Airflow webserver container exits with code 137, and the webserver console logs the error "critical worker timeout".
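Exit code 137 usually means the container was killed for running out of memory, and the gunicorn "critical worker timeout" suggests the webserver workers are being starved. One hedged option (the service name follows the official Airflow docker-compose layout and is an assumption; adapt it to your own file) is to raise the webserver worker timeout:

```yaml
# Sketch only: service name is an assumption based on the official
# Airflow docker-compose layout; adapt to your own compose file.
services:
  airflow-webserver:
    environment:
      # Allow slow gunicorn workers more time before they are killed
      # (the Airflow default is 120 seconds).
      AIRFLOW__WEBSERVER__WEB_SERVER_WORKER_TIMEOUT: "300"
```

Since exit code 137 is the container being OOM-killed, it may also help to raise the memory allocated to Docker Desktop's Linux VM in its settings.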

Regards

Shahid Iqbal
  • Does this answer your question? [Airflow scheduler does not appear to be running after execute a task](https://stackoverflow.com/questions/57668584/airflow-scheduler-does-not-appear-to-be-running-after-execute-a-task) – Vinay Kulkarni Jul 10 '21 at 23:55

1 Answer


In my case it was due to a bad configuration in the DAG parameters. Try the following configuration, and delete the previous DAG first:

import os
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

# Derive the DAG id from the file name, e.g. my_dag.py -> my_dag
_dag_name = os.path.basename(__file__).replace('.py', '')


def task_fail_slack_alert(context):
    # Placeholder; replace with your own Slack failure-alert callback.
    pass


def task_success_slack_alert(context):
    # Placeholder; replace with your own Slack success-alert callback.
    pass


default_args = {
    'depends_on_past': False,
    'start_date': datetime(2021, 4, 15),
    'retries': 3,
    'retry_delay': timedelta(minutes=5),
    'on_failure_callback': task_fail_slack_alert,
    'on_success_callback': task_success_slack_alert,
    'is_paused_upon_creation': False,
}

with DAG(
    dag_id=_dag_name,
    default_args=default_args,
    description='Print the installed pip packages',
    catchup=False,
    schedule_interval=None,  # trigger manually only
) as dag:

    bo1 = BashOperator(
        run_as_user='airflow',
        task_id='pip-list-ls-lh',
        # bash_command='ls -lh'
        bash_command='python3 -m pip list',
    )

    bo1  # single task, no dependencies to declare
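The `_dag_name` line above derives the `dag_id` from the file name, so the DAG id always matches the file it lives in. A minimal stdlib sketch of that derivation (the path is a hypothetical example; inside the DAG file itself it would be `__file__`):

```python
import os

# Hypothetical DAG file path; in the DAG file this would be __file__.
dag_file = "/opt/airflow/dags/print_pip_list.py"

# Same derivation as the _dag_name line: strip directories and the .py suffix.
dag_name = os.path.basename(dag_file).replace(".py", "")
print(dag_name)  # print_pip_list
```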