I want to launch a Spark job from a Java EE web application. SparkLauncher starts without throwing an exception, yet no job shows up on the Spark cluster. Can anybody help?
public static void runJob(String userId) throws Exception {
    long previous = System.currentTimeMillis();
    logger.info("initialize spark context...");
    init("spark-cluster-test.properties");
    Process spark = new SparkLauncher()
            .setSparkHome(spark_home)
            .setMaster(spark_master)
            .setAppName(app_name + timestamp)
            .setAppResource(jar_file)
            .setMainClass(main_class)
            .setConf(SparkLauncher.DRIVER_MEMORY, spark_driver_memory)
            .setConf("spark.network.timeout", spark_network_timeout)
            .setConf(SparkLauncher.DRIVER_EXTRA_CLASSPATH, class_path)
            .setConf(SparkLauncher.EXECUTOR_EXTRA_CLASSPATH, class_path)
            .setDeployMode(spark_deploy_mode)
            .addAppArgs(userId)
            .launch();
    spark.waitFor();
    logger.info("spark job is returned after " + (System.currentTimeMillis() - previous) + " milliseconds.");
}
The web server's log:
2016-01-20 10:27:17 -67129627 [http-bio-8088-exec-127] INFO - SparkJobLauncher is started with userId=102
2016-01-20 10:27:17 -67129628 [http-bio-8088-exec-127] INFO - initialize spark context...
2016-01-20 10:27:19 -67130838 [http-bio-8088-exec-127] INFO - spark job is returned after 1 seconds.
2016-01-20 10:27:19 -67130839 [http-bio-8088-exec-127] DEBUG - Creating a new SqlSession
But no job is started on the Spark cluster, and spark.waitFor() returns almost immediately after the call, even though the job is supposed to run for several seconds.
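One thing I plan to try next is checking the exit code of the launched process and draining its stdout/stderr, since `SparkLauncher.launch()` returns a plain `java.lang.Process` and any spark-submit error would only appear on its streams. A minimal sketch of that drain-and-wait pattern (`drainAndWait` is my own helper name, and I use `echo` here as a stand-in for the real spark-submit process):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class ProcessDrainer {

    // Consume the child's combined output and return its exit code.
    // If the streams are never read, the child can block on a full pipe,
    // and without the exit code a failed spark-submit dies silently.
    public static int drainAndWait(ProcessBuilder pb, StringBuilder out) throws Exception {
        pb.redirectErrorStream(true);            // merge stderr into stdout
        Process p = pb.start();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                out.append(line).append('\n');   // in real code, log each line
            }
        }
        return p.waitFor();
    }

    public static void main(String[] args) throws Exception {
        StringBuilder out = new StringBuilder();
        int exit = drainAndWait(new ProcessBuilder("echo", "hello"), out);
        System.out.println("exit=" + exit + ", output=" + out.toString().trim());
    }
}
```

With the real launcher, the same idea would mean reading `spark.getInputStream()` and `spark.getErrorStream()` before `spark.waitFor()`, and logging the returned exit code instead of discarding it.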