
I am using Airflow 1.8.1. I have a DAG that I believe is scheduled to run every 5 minutes, but it isn't doing so:

[screenshot: the DAG runs view, showing only two successful runs]

Ignore the 2 successful DAG runs; those were triggered manually.

I look at the scheduler log for that DAG and I see:

[2019-04-26 22:03:35,601] {jobs.py:343} DagFileProcessor839 INFO - Started process (PID=5653) to work on /usr/local/airflow/dags/retrieve_airflow_artifacts.py
[2019-04-26 22:03:35,606] {jobs.py:1525} DagFileProcessor839 INFO - Processing file /usr/local/airflow/dags/retrieve_airflow_artifacts.py for tasks to queue
[2019-04-26 22:03:35,607] {models.py:168} DagFileProcessor839 INFO - Filling up the DagBag from /usr/local/airflow/dags/retrieve_airflow_artifacts.py
[2019-04-26 22:03:36,083] {jobs.py:1539} DagFileProcessor839 INFO - DAG(s) ['retrieve_airflow_artifacts'] retrieved from /usr/local/airflow/dags/retrieve_airflow_artifacts.py
[2019-04-26 22:03:36,112] {jobs.py:1172} DagFileProcessor839 INFO - Processing retrieve_airflow_artifacts
[2019-04-26 22:03:36,126] {jobs.py:566} DagFileProcessor839 INFO - Skipping SLA check for <DAG: retrieve_airflow_artifacts> because no tasks in DAG have SLAs
[2019-04-26 22:03:36,132] {models.py:323} DagFileProcessor839 INFO - Finding 'running' jobs without a recent heartbeat
[2019-04-26 22:03:36,132] {models.py:329} DagFileProcessor839 INFO - Failing jobs without heartbeat after 2019-04-26 21:58:36.132768
[2019-04-26 22:03:36,139] {jobs.py:351} DagFileProcessor839 INFO - Processing /usr/local/airflow/dags/retrieve_airflow_artifacts.py took 0.539 seconds
[2019-04-26 22:04:06,776] {jobs.py:343} DagFileProcessor845 INFO - Started process (PID=5678) to work on /usr/local/airflow/dags/retrieve_airflow_artifacts.py
[2019-04-26 22:04:06,780] {jobs.py:1525} DagFileProcessor845 INFO - Processing file /usr/local/airflow/dags/retrieve_airflow_artifacts.py for tasks to queue
[2019-04-26 22:04:06,780] {models.py:168} DagFileProcessor845 INFO - Filling up the DagBag from /usr/local/airflow/dags/retrieve_airflow_artifacts.py
[2019-04-26 22:04:07,258] {jobs.py:1539} DagFileProcessor845 INFO - DAG(s) ['retrieve_airflow_artifacts'] retrieved from /usr/local/airflow/dags/retrieve_airflow_artifacts.py
[2019-04-26 22:04:07,287] {jobs.py:1172} DagFileProcessor845 INFO - Processing retrieve_airflow_artifacts
[2019-04-26 22:04:07,301] {jobs.py:566} DagFileProcessor845 INFO - Skipping SLA check for <DAG: retrieve_airflow_artifacts> because no tasks in DAG have SLAs
[2019-04-26 22:04:07,307] {models.py:323} DagFileProcessor845 INFO - Finding 'running' jobs without a recent heartbeat
[2019-04-26 22:04:07,307] {models.py:329} DagFileProcessor845 INFO - Failing jobs without heartbeat after 2019-04-26 21:59:07.307607
[2019-04-26 22:04:07,314] {jobs.py:351} DagFileProcessor845 INFO - Processing /usr/local/airflow/dags/retrieve_airflow_artifacts.py took 0.538 seconds

over and over again. I've compared this to a DAG on another server, and from that I know there should be extra log records indicating that the DAG was triggered by its schedule; there are no such records in this log file.

Here's how the schedule of my DAG is defined:

import datetime

args = {
    'owner': 'airflow',
    'start_date': (datetime.datetime.now() - datetime.timedelta(minutes=5))
}

dag = DAG(
    dag_id='retrieve_airflow_artifacts', default_args=args,
    schedule_interval="0,5,10,15,20,25,30,35,40,45,50,55 * * * *")

Could someone help me figure out why my DAG isn't running? I've looked high and low and cannot work it out.

jamiet

1 Answer


If I had to guess, I would say your start_date is causing you some issues.

Change your args to use a static start date and prevent it from running on past intervals:

from datetime import datetime

args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2019, 4, 27)  # year, month, day
}

Also, just to make it easier to read, change your DAG definition to the following (same functionality):

dag = DAG(
    dag_id='retrieve_airflow_artifacts', 
    default_args=args,
    schedule_interval="*/5 * * * *"
)

That should allow the scheduler to pick it up!
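Putting the two pieces together, a minimal complete DAG file might look like the sketch below. Note the BashOperator task is a placeholder I've assumed (a DAG needs at least one task to do anything); substitute whatever your DAG actually does:

import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime.datetime(2019, 4, 27),  # static start date
}

dag = DAG(
    dag_id='retrieve_airflow_artifacts',
    default_args=args,
    schedule_interval="*/5 * * * *",
)

# Placeholder task -- swap in your real work here.
retrieve = BashOperator(
    task_id='retrieve_artifacts',
    bash_command='echo "retrieving artifacts"',
    dag=dag,
)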

It's generally recommended not to set your start_date dynamically.

Taken from the Airflow FAQ:

We recommend against using dynamic values as start_date, especially datetime.now() as it can be quite confusing. The task is triggered once the period closes, and in theory an @hourly DAG would never get to an hour after now as now() moves along.
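To make the failure mode concrete, here's a rough sketch in plain Python (not Airflow's actual scheduler code) of why a sliding start_date means the first schedule interval never closes:

import datetime

INTERVAL = datetime.timedelta(minutes=5)

def first_run_fires_at(start_date):
    # The scheduler only creates a run once the interval
    # [start_date, start_date + INTERVAL] has fully closed.
    return start_date + INTERVAL

# First parse of the DAG file:
start = datetime.datetime.now() - datetime.timedelta(minutes=5)
print(first_run_fires_at(start))  # == now: the run always looks "just about" due

# The scheduler re-parses the file continuously, recomputing start_date:
start = datetime.datetime.now() - datetime.timedelta(minutes=5)
print(first_run_fires_at(start))  # moved forward again: still "now", never in the past

# The trigger time recedes exactly as fast as the clock advances,
# so the interval never closes and no scheduled run is ever created.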

Another SO question on this: why dynamic start dates cause issues

Zack