I am new to Airflow. I was trying to schedule a job that uses a BashOperator to run a spark-submit command. It works fine, but it keeps the Airflow task slot busy until the Spark job completes.
cmd = "ssh hadoop@<ipaddress> spark-submit \
--master yarn \
--deploy-mode cluster \
--executor-memory 2g \
--executor-cores 2 \
/home/hadoop/main.py"
t = BashOperator(task_id='task1',bash_command=cmd,dag=dag)
How can I make Airflow just submit the command and move on to the next task?
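From the Spark docs I understand that, in cluster deploy mode, spark-submit can be told to exit as soon as YARN accepts the application by setting spark.yarn.submit.waitAppCompletion=false. Would something like this sketch (same command as above, with the flag added) be the right approach? I realize the trade-off is that Airflow would then no longer know whether the Spark job itself succeeded.

# Sketch: the client should return right after YARN accepts the application,
# freeing the Airflow task almost immediately; Airflow only sees whether the
# submission succeeded, not whether the Spark job itself finished OK.
cmd = """ssh hadoop@<ipaddress> spark-submit \
    --master yarn \
    --deploy-mode cluster \
    --conf spark.yarn.submit.waitAppCompletion=false \
    --executor-memory 2g \
    --executor-cores 2 \
    /home/hadoop/main.py"""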
I am currently running Airflow on a standalone EC2 machine.
Also, how can we make Airflow run multiple tasks at the same time? A sketch of what I mean is below.
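Here is a hypothetical two-task DAG to illustrate (the DAG id, task ids, and commands are made up). My understanding is that tasks with no dependency declared between them can run in parallel, but only if the executor allows it: the default SequentialExecutor (used with SQLite) runs one task at a time, whereas LocalExecutor with a database backend such as MySQL or Postgres can run tasks concurrently. Is that correct?

from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

dag = DAG('parallel_example',
          start_date=datetime(2018, 1, 1),
          schedule_interval='@daily')

# No dependency is set between these two tasks, so the scheduler is
# free to run them at the same time (executor permitting).
t1 = BashOperator(task_id='job_a', bash_command='echo job_a', dag=dag)
t2 = BashOperator(task_id='job_b', bash_command='echo job_b', dag=dag)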