
I am using Google Cloud Platform.

With Composer I have an Airflow environment that schedules different task pipelines.

One of those pipelines consists of:

  1. A few tasks to prepare for the Dataflow job
  2. The Dataflow job itself
  3. Some tasks that exploit the results of the job

The DAG is configured so that each step waits for the previous ones to succeed (see the simplified sketch below).
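The DAG looks roughly like this. It is a simplified sketch rather than my exact code: the task IDs, the template path, the project, and the region are placeholders, and I am assuming the template-based Dataflow operator from the Google provider.

```python
from airflow import DAG
from airflow.operators.dummy import DummyOperator
from airflow.providers.google.cloud.operators.dataflow import (
    DataflowTemplatedJobStartOperator,
)
from airflow.utils.dates import days_ago

with DAG(
    dag_id="my_pipeline",            # placeholder name
    schedule_interval="@daily",
    start_date=days_ago(1),
) as dag:
    # 1. A few preparation tasks (placeholder)
    prepare = DummyOperator(task_id="prepare")

    # 2. The Dataflow job, launched from a template (all values are placeholders)
    dataflow_job = DataflowTemplatedJobStartOperator(
        task_id="dataflow_job",
        template="gs://my-bucket/templates/my_template",
        project_id="my-project",
        location="europe-west1",
    )

    # 3. Tasks that use the results of the job (placeholder)
    exploit_results = DummyOperator(task_id="exploit_results")

    # Each step waits for the previous one to succeed (default trigger rule)
    prepare >> dataflow_job >> exploit_results
```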

I have an issue: the Dataflow task is marked as failed in Airflow, so the next steps are not executed. However, when I check the job in Dataflow, it seems to have succeeded.

This is all the (truncated) log I have in Airflow: [screenshot of the Airflow task log]

Is this a known issue?
