
We are using Google Dataflow for batch data processing and are looking for workflow orchestration tools, something similar to what Azkaban does for Hadoop.

The key things we are looking for are:

  • Configuring workflows
  • Scheduling workflows
  • Monitoring and alerting on failed workflows
  • Ability to rerun failed jobs

We have evaluated Pentaho, but these features are only available in its Enterprise edition, which is expensive. We are currently evaluating Azkaban because it supports javaprocess job types, but Azkaban was built primarily for Hadoop jobs, so it integrates much more deeply with Hadoop infrastructure than with plain javaprocess jobs.

We would appreciate suggestions for open-source or very low-cost solutions.

Mayur Shah

2 Answers


It sounds like Apache Airflow (https://github.com/apache/incubator-airflow) should meet your needs, and it now has a Dataflow operator (https://github.com/apache/incubator-airflow/blob/master/airflow/contrib/operators/dataflow_operator.py).
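
For reference, here is a minimal sketch of what such a DAG could look like, assuming the DataFlowJavaOperator from the contrib module linked above. The jar path, project, and bucket names are placeholders, and exact class/parameter names may vary between Airflow versions.

```python
# Sketch of an Airflow DAG that launches a Dataflow batch job.
# The jar path, project, and staging bucket are placeholders; the operator
# name follows the contrib dataflow_operator module linked above.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.contrib.operators.dataflow_operator import DataFlowJavaOperator

default_args = {
    'owner': 'data-eng',
    'start_date': datetime(2017, 1, 1),
    'retries': 1,                        # rerun a failed job automatically
    'retry_delay': timedelta(minutes=5),
}

with DAG('dataflow_batch_pipeline',
         default_args=default_args,
         schedule_interval='@daily') as dag:

    run_pipeline = DataFlowJavaOperator(
        task_id='run_dataflow_pipeline',
        jar='gs://my-bucket/pipelines/my-pipeline.jar',   # placeholder jar
        options={
            'project': 'my-gcp-project',                  # placeholder project
            'stagingLocation': 'gs://my-bucket/staging',  # placeholder bucket
        },
    )
```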


To orchestrate Google Dataflow jobs you can also use Cloud Composer, a managed workflow orchestration service built on Apache Airflow. It gives more flexibility: with it you can orchestrate most Google Cloud services as well as workflows that span on-premises systems and the public cloud.
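
Since Cloud Composer runs standard Apache Airflow, the requirements in the question (scheduling, alerting on failures, rerunning failed jobs) map onto ordinary DAG settings. A rough sketch, with a placeholder alert address and schedule:

```python
# Sketch of DAG settings that cover scheduling, alerting, and automatic reruns.
# Alerting via email_on_failure assumes SMTP is configured in Airflow/Composer;
# the alert address and schedule below are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    'owner': 'data-eng',
    'start_date': datetime(2017, 1, 1),
    'email': ['alerts@example.com'],    # placeholder alert address
    'email_on_failure': True,           # alert on failed tasks
    'retries': 2,                       # automatic rerun of failed tasks
    'retry_delay': timedelta(minutes=10),
}

with DAG('orchestrated_workflow',
         default_args=default_args,
         schedule_interval='0 2 * * *') as dag:   # scheduled daily at 02:00

    # Any workflow step becomes a task; a trivial placeholder task here.
    start = BashOperator(task_id='start',
                         bash_command='echo "workflow started"')
```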

SANN3