
I have a shell script that initializes a Spark Streaming job in YARN cluster mode, and I have scheduled that script through Autosys. When I kill the Autosys job, I would like the Spark job running in cluster mode to be killed as well.

I have tried running `yarn application -kill` from the shell script on an error return code, but it never gets executed. However, from another shell window I am able to kill the job: there, `yarn application -kill` works perfectly and kills the application.

Is there any workaround to kill a cluster-mode job automatically on interruption, from the same shell? A rough sketch of the pattern I'm using is below.
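For illustration, the failing pattern looks roughly like this (the jar name, flags, and application-name match are placeholders, not the real job):

```bash
#!/usr/bin/env bash
# Submit in cluster mode; with the default
# spark.yarn.submit.waitAppCompletion=true, spark-submit blocks here
# until the application finishes or fails.
spark-submit --master yarn --deploy-mode cluster my-streaming-job.jar

if [ $? -ne 0 ]; then
    # Look up the application id by name and kill it. When Autosys kills
    # this script, the shell dies before this branch ever runs, which is
    # why the kill never happens.
    APP_ID=$(yarn application -list 2>/dev/null \
        | awk '/my-streaming-job/ {print $1; exit}')
    yarn application -kill "$APP_ID"
fi
```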

Elvish_Blade
  • Possible duplicate of [How do I stop a spark streaming job?](https://stackoverflow.com/questions/32582730/how-do-i-stop-a-spark-streaming-job) – pushpavanthar Jan 31 '19 at 13:12

1 Answer


In the error-return-code logic, run `yarn application -kill <appId>` as an orphan process, i.e. detached from the script's session, so that it survives when Autosys tears down the script's process group.
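A minimal sketch of that idea, assuming spark-submit prints the `application_...` id in its output; the log path, jar name, and flags are placeholders:

```bash
#!/usr/bin/env bash
LOG=submit.log   # assumed scratch file for the spark-submit output

# Submit in the background so the script can install signal handlers
# while spark-submit keeps polling the application.
spark-submit --master yarn --deploy-mode cluster my-streaming-job.jar \
    > "$LOG" 2>&1 &
SUBMIT_PID=$!

# Scrape the YARN application id from the spark-submit output.
APP_ID=""
while [ -z "$APP_ID" ]; do
    APP_ID=$(grep -o 'application_[0-9_]*' "$LOG" | head -n 1)
    [ -n "$APP_ID" ] && break
    kill -0 "$SUBMIT_PID" 2>/dev/null || break   # submission died early
    sleep 2
done

cleanup() {
    # setsid detaches the kill into its own session, so it keeps running
    # even after Autosys terminates this script's whole process group.
    setsid yarn application -kill "$APP_ID" </dev/null >/dev/null 2>&1 &
}
trap cleanup INT TERM

wait "$SUBMIT_PID"
```

Note that this only helps for catchable signals such as SIGTERM; if the scheduler sends SIGKILL directly, no trap can run, and the kill has to be issued from outside the script.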

Elvish_Blade