
I'm writing a Spark application and running it with the spark-submit shell script (using yarn-cluster/yarn-client mode).

As far as I can see, the exit code of spark-submit is determined by the status of the related YARN application: 0 if the status is SUCCEEDED, 1 otherwise.

I want the option to return a different exit code, for the case where my application succeeded but with some errors.

Is it possible to return a different exit code from the application?

I tried using System.exit(), but it didn't work.

Thanks.


2 Answers


It is possible in client mode but not in cluster mode. There is a workaround for cluster mode.

My answer to this question should help you.
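
In yarn-client mode the driver runs inside the spark-submit JVM, so calling System.exit with your own code after stopping the SparkContext should surface as spark-submit's exit code. A minimal sketch of that idea (the error counting, the class name, and the exit code 2 are placeholders for illustration):

    import org.apache.spark.{SparkConf, SparkContext}

    object MyApp {
      // Hypothetical code for "succeeded, but with some errors"
      val SucceededWithErrors = 2

      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("my-app"))
        try {
          val data = sc.parallelize(1 to 100)
          // Pretend every tenth record is "bad"; count them instead of failing the job
          val badRecords = data.filter(_ % 10 == 0).count()

          // Stop the context first so YARN records the application as finished
          sc.stop()

          // In yarn-client mode the driver shares the spark-submit JVM,
          // so this value becomes spark-submit's exit code
          System.exit(if (badRecords > 0) SucceededWithErrors else 0)
        } catch {
          case _: Exception =>
            sc.stop()
            System.exit(1)
        }
      }
    }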

  • Read that, and I still don't understand how using yarn-client helps in this case; we'll still get the YARN status. – roh Feb 04 '17 at 14:39

If you run in cluster mode, spark-submit returns immediately with the submission ID as part of a JSON response and does not wait for the application status. After that you can query the status with

    spark-submit --status [submission ID]

If you run in local or standalone mode, you should be able to get the exit code directly from the spark-submit process.
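
For instance, a small wrapper can launch spark-submit and read the process exit code; this is only a sketch, and the master, class name, and jar path are placeholders:

    import scala.sys.process._

    object SubmitWrapper {
      def main(args: Array[String]): Unit = {
        // Placeholder command: adjust master, class and jar to your setup
        val cmd = Seq(
          "spark-submit",
          "--master", "local[*]",
          "--class", "MyApp",
          "/path/to/my-app.jar"
        )
        // .! runs the process and returns its exit code, which in local/client
        // mode is the exit code of the driver JVM (e.g. the System.exit value)
        val exitCode = cmd.!
        println(s"spark-submit exited with code $exitCode")
        sys.exit(exitCode)
      }
    }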