
I have a DAG configured like below:

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash_operator import BashOperator
    from airflow.contrib.operators.databricks_operator import DatabricksSubmitRunOperator

    args = {
        'owner': 'XXX',
        'depends_on_past': False,
        'start_date': datetime(2018, 2, 26),
        'email': ['sample@sample.com'],
        'email_on_failure': False,
        'retries': 1,
        'retry_delay': timedelta(minutes=5)
    }

    dag = DAG(dag_id='Daily_Report',
              default_args=args,
              schedule_interval='0 11 * * *',
              dagrun_timeout=timedelta(seconds=30))

I have a BashOperator and a Databricks operator:

    run_this = BashOperator(task_id='run_report',
                            bash_command=templated_command,
                            dag=dag)

    notebook_run = DatabricksSubmitRunOperator(
        task_id='notebook_run',
        notebook_task=notebook_task,
        existing_cluster_id='xxxx',
        dag=dag)
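
(`templated_command` and `notebook_task` aren't shown above; hypothetical placeholders, just so the snippets are self-contained, could look like this:)

    # Hypothetical stand-in for the templated bash command; {{ ds }} is Airflow's execution-date macro
    templated_command = """
    echo "Running the daily report for {{ ds }}"
    """

    # Hypothetical notebook task dict, following the Databricks Runs Submit API shape
    notebook_task = {
        'notebook_path': '/Users/sample@sample.com/Daily_Report'  # made-up path
    }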

I'm setting the dependency with `run_this.set_downstream(notebook_run)`.
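
(For reference, the same dependency can also be written with Airflow's bitshift syntax; this is just an equivalent way of expressing the line above:)

    run_this >> notebook_run  # same as run_this.set_downstream(notebook_run)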

The BashOperator runs fine, but the Databricks operator never runs; it just sits with a blank status like below.

(screenshot: the task shows a blank status in the Airflow UI)

Anything I'm missing? I'm using the Airflow version from Databricks here: https://github.com/databricks/incubator-airflow

  • Have you checked if all components are up and running? See https://stackoverflow.com/questions/49021055/airflow-1-9-0-is-queuing-but-not-launching-tasks/ – tobi6 Mar 02 '18 at 09:17
  • Yes. All the components are up and running. If I trigger the DAG manually, it completes both jobs. – Ongole Mar 02 '18 at 14:12

1 Answer


Try highlighting the text in the white label. It will probably say "None". White on white is terrible UX, so I'm not sure why Airflow does it that way.

(screenshot: highlighting the white label reveals the text "None")
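
(If you want to confirm this outside the UI, a rough sketch using Airflow 1.x internals, with a placeholder execution date, reads the task instance state straight from the metadata database; it comes back as None while the task has never been scheduled:)

    from datetime import datetime
    from airflow.models import TaskInstance

    # Look up the recorded state of the stuck task for one execution date
    # (the date here is a placeholder). A task that was never scheduled
    # has no recorded state, i.e. None.
    ti = TaskInstance(task=notebook_run, execution_date=datetime(2018, 3, 2))
    print(ti.current_state())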

– c0bra