I am using SparkLauncher in Spark v1.6.0. My problem is that when I use this class to launch my Spark jobs, it returns immediately and no job is submitted. My code is as follows:
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

new SparkLauncher()
        .setAppName("test word count")
        .setAppResource("file://c:/temp/my.jar")
        .setMainClass("my.spark.app.Main")
        .setMaster("spark://master:7077")
        .startApplication(new SparkAppHandle.Listener() {
            @Override public void stateChanged(SparkAppHandle h) { }
            @Override public void infoChanged(SparkAppHandle h) { }
        });
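For what it's worth, startApplication() is asynchronous and returns a SparkAppHandle right away. A minimal sketch of blocking on that handle until the job reaches a final state (the class name LauncherTest is just for illustration) would be:

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LauncherTest {
    public static void main(String[] args) throws Exception {
        SparkAppHandle handle = new SparkLauncher()
                .setAppName("test word count")
                .setAppResource("file://c:/temp/my.jar")
                .setMainClass("my.spark.app.Main")
                .setMaster("spark://master:7077")
                .startApplication();

        // Poll the handle until the application reaches a final state
        // (FINISHED, FAILED, or KILLED) instead of letting main() exit.
        while (!handle.getState().isFinal()) {
            Thread.sleep(1000);
        }
        System.out.println("Final state: " + handle.getState());
    }
}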
When I debug into the code, I notice, to my surprise, that all this class really does is call the spark-submit.cmd script using ProcessBuilder:
[C:/tmp/spark-1.6.0-bin-hadoop2.6/bin/spark-submit.cmd, --master, spark://master:7077, --name, "test word count", --class, my.spark.app.Main, C:/temp/my.jar]
However, if I run this command (the one built by ProcessBuilder) directly on the console, a Spark job is submitted.
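For reference, replaying that same command from Java with ProcessBuilder, inheriting the child's I/O so its output shows up on my console, would look roughly like this (the class name SubmitByHand is mine):

import java.util.Arrays;

public class SubmitByHand {
    public static void main(String[] args) throws Exception {
        // The same command SparkLauncher builds, run by hand with the
        // child's stdout/stderr inherited so its output reaches this console.
        Process p = new ProcessBuilder(Arrays.asList(
                "C:/tmp/spark-1.6.0-bin-hadoop2.6/bin/spark-submit.cmd",
                "--master", "spark://master:7077",
                "--name", "test word count",
                "--class", "my.spark.app.Main",
                "C:/temp/my.jar"))
                .inheritIO()
                .start();
        System.out.println("spark-submit exited with " + p.waitFor());
    }
}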
There's another method, SparkLauncher.launch(), but the Javadocs recommend using startApplication() instead.
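If it helps with debugging, my understanding is that launch() returns the raw java.lang.Process, whose output streams then need to be consumed by the caller. A rough sketch (class and helper names are mine) would be:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

import org.apache.spark.launcher.SparkLauncher;

public class LaunchDirectly {
    public static void main(String[] args) throws Exception {
        final Process spark = new SparkLauncher()
                .setAppName("test word count")
                .setAppResource("file://c:/temp/my.jar")
                .setMainClass("my.spark.app.Main")
                .setMaster("spark://master:7077")
                .launch();

        // launch() returns the raw child process; both of its pipes have to
        // be drained, or spark-submit can stall on a full pipe buffer.
        Thread stderr = new Thread(new Runnable() {
            @Override public void run() { pipe(spark.getErrorStream()); }
        });
        stderr.start();
        pipe(spark.getInputStream());
        stderr.join();

        System.out.println("spark-submit exited with " + spark.waitFor());
    }

    private static void pipe(InputStream in) {
        try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}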
Any idea what's going on?