I have a working Python script that currently runs as a cron job, and I want to convert it into a DAG with PythonOperator(s)
as we are now migrating to Airflow.
Say that I have the functions a(), b(), c(), d()
and their execution order is a -> b -> c -> d.
Let's say that the functions look like this:
def a():
    print("Happy")

def b():
    print("Birthday")

def c():
    print("to")

def d():
    print("you!")
** This is just an example; my actual code for each function is more complex.
I have this DAG:
import airflow
from airflow import DAG
from airflow.operators.python_operator import PythonOperator

args = {
    'owner': 'airflow',
    'start_date': airflow.utils.dates.days_ago(2),
}

dag = DAG(dag_id='example', default_args=args, schedule_interval='0 10 * * *')
a = PythonOperator(task_id='a', dag=dag)
b = PythonOperator(task_id='b', dag=dag)
c = PythonOperator(task_id='c', dag=dag)
d = PythonOperator(task_id='d', dag=dag)
a.set_downstream(b)
b.set_downstream(c)
c.set_downstream(d)
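(As a side note, I have also seen examples that chain the tasks with the bit-shift operator, which I assume is equivalent to the set_downstream calls above:

a >> b >> c >> d

so the dependencies are not the part I'm stuck on.)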
What I don't understand is where I should put the code of a(), b(), c() and d(),
and how I tell each PythonOperator which function it should execute.
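My guess is that PythonOperator takes some parameter (python_callable, maybe?) that points at the function, so that a single task would look roughly like this:

def a():
    print("Happy")

a_task = PythonOperator(task_id='a', python_callable=a, dag=dag)

but I haven't been able to confirm that this is correct, or whether the function definitions should simply live in the same file as the DAG.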
You could say that I'm looking for a way to convert my Python script into an Airflow DAG, with each function becoming a separate operator.
I thought this would be very simple and basic, but I couldn't find any information on how to do it.