
I am new to Airflow and have written a simple SSHOperator task to learn how it works.

from datetime import datetime

from airflow import DAG
from airflow.contrib.hooks.ssh_hook import SSHHook
from airflow.contrib.operators.ssh_operator import SSHOperator

default_args = {
    'start_date': datetime(2018, 6, 20)
}

dag = DAG(dag_id='ssh_test', schedule_interval='@hourly', default_args=default_args)

sshHook = SSHHook(ssh_conn_id='testing')

t1 = SSHOperator(
    task_id='task1',
    command='echo Hello World',
    ssh_hook=sshHook,
    dag=dag)

When I manually trigger it in the UI, the DAG shows a status of running, but the operator stays white with no status.

I'm wondering why my task isn't queuing. Does anyone have any ideas? My airflow.cfg is the default, if that is useful information.

Even this isn't running:

from datetime import datetime

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator

dag = DAG(dag_id='test', start_date=datetime(2018, 6, 21), schedule_interval='0 0 * * *')
runMe = DummyOperator(task_id='testest', dag=dag)
friendly1358
    Take a look here for starters: https://stackoverflow.com/questions/49021055/airflow-1-9-0-is-queuing-but-not-launching-tasks/49047832#49047832 – tobi6 Jun 21 '18 at 18:08

2 Answers


Make sure you've started the Airflow Scheduler in addition to the Airflow Web Server: airflow scheduler

Mike
  • check if airflow scheduler is running
  • check if airflow webserver is running
  • check if all DAGs are set to On in the web UI
  • check if the DAGs have a start date which is in the past
  • check if the DAGs have a proper schedule (before the schedule date) which is shown in the web UI
  • check if the DAG has the proper pool and queue.
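
To illustrate the start-date point in the list above: the scheduler only launches a run after its scheduled interval has closed, so a DAG whose start_date lies at or after "now" produces no runs yet. The sketch below models that rule with plain datetime arithmetic; first_trigger_time is a hypothetical helper for illustration, not an Airflow API.

```python
from datetime import datetime, timedelta

# Sketch of Airflow's scheduling rule (not Airflow code): the run covering a
# given interval is triggered only after that interval has *ended*.
def first_trigger_time(start_date, interval):
    """Return when the first DAG run would actually be launched."""
    return start_date + interval

start = datetime(2018, 6, 20)   # start_date from the question
hourly = timedelta(hours=1)     # '@hourly' schedule_interval

# The first run covers 00:00-01:00 and is launched at 01:00.
print(first_trigger_time(start, hourly))  # 2018-06-20 01:00:00
```

So a start_date of "today" combined with a daily schedule means the first run won't appear until tomorrow, which can look like the task is silently stuck.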
DennisLi
  • "check if the DAGs have a proper schedule (before the schedule date) which is shown in the web UI" was the key for me -- had set the schedule for when it was first going to be run. – Mike Burger Dec 05 '22 at 15:48