I have many Spark 1.6 applications that I want to run one after another on YARN (they read from and write to the same Hive tables).
I tried submitting them all to a common queue with spark-submit --queue QUEUENAME ...,
but the applications still run in parallel.
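For reference, this is roughly how I submit each application (QUEUENAME, the class, and the jar path are placeholders for my actual jobs):

```bash
# Submit each job to the same YARN queue; QUEUENAME, the class name,
# and the jar path stand in for my actual applications.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --queue QUEUENAME \
  --class com.example.MyApp \
  my-app.jar
```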
Is there another way to ensure that only one application runs at a time (other than chaining them in a loop, e.g. in a bash script)?
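The workaround I would like to avoid looks something like the sketch below. It relies on cluster mode with spark.yarn.submit.waitAppCompletion left at its default of true, so each spark-submit call blocks until its application finishes; the jar names are placeholders:

```bash
#!/usr/bin/env bash
# Sequential workaround: each spark-submit blocks until its app finishes
# (cluster mode, spark.yarn.submit.waitAppCompletion=true by default),
# so the jobs run strictly one after another.
for jar in app1.jar app2.jar app3.jar; do   # placeholder jar names
  spark-submit \
    --master yarn \
    --deploy-mode cluster \
    --queue QUEUENAME \
    "$jar" || exit 1   # stop the chain if a job fails
done
```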