I am using Google Cloud.
With Cloud Composer I have an Airflow environment that schedules different task pipelines.
One of those pipelines consists of:
- a few tasks to prepare for the Dataflow job
- the Dataflow job itself
- some tasks that exploit the results of the job
The DAG is configured so that each step waits for the previous ones to succeed.
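For reference, here is a minimal sketch of that DAG shape. All task IDs, the template path, and the region are placeholders (not my real configuration); it assumes Airflow 2.x with the Google provider installed, and uses EmptyOperator stand-ins for the prepare/exploit steps:

```python
from airflow import DAG
from airflow.operators.empty import EmptyOperator
from airflow.providers.google.cloud.operators.dataflow import (
    DataflowTemplatedJobStartOperator,
)
import pendulum

with DAG(
    dag_id="example_pipeline",  # placeholder name
    start_date=pendulum.datetime(2023, 1, 1, tz="UTC"),
    schedule="@daily",
    catchup=False,
) as dag:
    # placeholder for the preparation tasks
    prepare = EmptyOperator(task_id="prepare_inputs")

    # the Dataflow job, launched from a template (placeholder path/region)
    run_dataflow = DataflowTemplatedJobStartOperator(
        task_id="run_dataflow_job",
        template="gs://my-bucket/templates/my-template",
        location="europe-west1",
    )

    # placeholder for the downstream tasks that use the job's results
    exploit = EmptyOperator(task_id="exploit_results")

    # default trigger_rule=all_success: each step waits for the previous to succeed
    prepare >> run_dataflow >> exploit
```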
I have an issue: the Dataflow task is marked as failed in Airflow, so the next steps are not executed. However, when I check the job in the Dataflow console, it seems to have succeeded.
This is all the (truncated) log I have in Airflow:
Is this a known issue?