I'm using Airflow to schedule a workflow of Spark jobs. After installation, I copied the DAG files into the DAGs folder set in airflow.cfg. I can backfill the DAG and its BashOperators run successfully, but there is always a warning like the one mentioned. I haven't verified whether scheduling works, but I doubt it does, since the warning says the master scheduler doesn't know of my DAG's existence. How can I eliminate this warning and get scheduling to work? Has anybody run into the same issue and can help me out?
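For illustration, a DAG file of this general shape is what sits in the DAGs folder (a minimal sketch, not the exact job: the `dag_id`, schedule, and `spark-submit` command are placeholders, and the import path is the Airflow 1.x one):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator  # Airflow 1.x import path

# Hypothetical example: dag_id, schedule_interval and the spark-submit
# call are placeholders for the real Spark job.
dag = DAG(
    dag_id="spark_job_example",
    start_date=datetime(2018, 4, 1),
    schedule_interval="@daily",
)

submit_job = BashOperator(
    task_id="submit_spark_job",
    bash_command="spark-submit --master yarn /path/to/job.py",
    dag=dag,
)
```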
1 Answer
This is usually connected to the scheduler not running or the refresh interval being set too wide. No log entries were provided, so we cannot analyze from there. Also, unfortunately, the very cause might have been overlooked, because this is usually the root of the problem:
> I didn't verify if the scheduling is fine.
So first you should check that both of the following services are running:

`airflow webserver`

and

`airflow scheduler`
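A quick way to do that (a sketch assuming a plain Linux install; the port and process names may differ in your setup):

```sh
# Start both services (in production you would run these under
# systemd or supervisord rather than backgrounding them)
airflow webserver -p 8080 &
airflow scheduler &

# Confirm both processes are actually alive
ps aux | grep -E 'airflow (webserver|scheduler)' | grep -v grep
```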
If that doesn't help, see this post for more reference: Airflow 1.9.0 is queuing but not launching tasks

– tobi6
- Thanks for your comments, tobi, much appreciated. After the scheduler is started, the warning disappears. – bronzels Apr 23 '18 at 11:50
- In addition to the answer above, i.e. checking that `airflow webserver` and `airflow scheduler` are running, make sure the environment variable `AIRFLOW_HOME` is set properly in every environment where you run them. I was testing Airflow for the first time: in one terminal I had set the variable and ran the webserver, but in the terminal where I ran the scheduler I forgot to set it. Hence I had the same error (see the sketch after these comments). – Hassan Kamal Sep 05 '18 at 16:04
- I had the same issue. I stopped the webserver and scheduler and started them again, but it still showed the same message. Then I restarted the server and the message was gone. – JPatel Nov 27 '18 at 11:44
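A sketch of the `AIRFLOW_HOME` pitfall from the comment above (the path is an example; the point is that both services must see the same value, otherwise they look at different DAGs folders):

```sh
# Terminal 1
export AIRFLOW_HOME=/home/me/airflow
airflow webserver

# Terminal 2 -- must export the SAME value, or the scheduler
# falls back to the default AIRFLOW_HOME (~/airflow) and never
# sees the DAGs the webserver is displaying
export AIRFLOW_HOME=/home/me/airflow
airflow scheduler
```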