I'm trying to submit a Spark 2.3 job to a Kubernetes cluster from Scala using the Play Framework.
I have also tried it as a simple Scala program without Play (a rough sketch of that version is at the end of this post).
The job gets submitted to the Kubernetes cluster, but stateChanged and infoChanged are never invoked. I also want to be able to get handle.getAppId.
I'm using spark-submit to submit the job, as described here:
$ bin/spark-submit \
    --master k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port> \
    --deploy-mode cluster \
    --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=5 \
    --conf spark.kubernetes.container.image=<spark-image> \
    local:///path/to/examples.jar
Here is the code that submits the job (Play controller action):
import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}
import scala.util.control.NonFatal

def index = Action {
  try {
    val spark = new SparkLauncher()
      .setMaster("my k8 apiserver host")
      .setVerbose(true)
      .addSparkArg("--verbose")
      .setMainClass("myClass")
      .setAppResource("hdfs://server/inputs/my.jar")
      .setConf("spark.app.name", "myapp")
      .setConf("spark.executor.instances", "5")
      .setConf("spark.kubernetes.container.image", "mydockerimage")
      .setDeployMode("cluster")
      .startApplication(new SparkAppHandle.Listener() {
        // Never invoked, even though the job is submitted to the cluster.
        override def infoChanged(handle: SparkAppHandle): Unit = {
          System.out.println("Spark App Id [" + handle.getAppId
            + "] Info Changed. State [" + handle.getState + "]")
        }

        // Never invoked either, so I can't detect FINISHED or read the app id.
        override def stateChanged(handle: SparkAppHandle): Unit = {
          System.out.println("Spark App Id [" + handle.getAppId
            + "] State Changed. State [" + handle.getState + "]")
          if (handle.getState.toString == "FINISHED") System.exit(0)
        }
      })

    Ok(spark.getState().toString())
  } catch {
    case NonFatal(e) =>
      println("failed with exception: " + e)
  }
  Ok
}
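For completeness, the simple Scala program I mentioned (no Play involved) is roughly along these lines; the class name, jar path and image are the same placeholders as above, and the CountDownLatch is only there to keep the JVM alive while waiting for the callbacks. It behaves the same way: neither stateChanged nor infoChanged ever fires.

import java.util.concurrent.CountDownLatch
import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

object SubmitToK8s extends App {
  val finished = new CountDownLatch(1)

  val handle = new SparkLauncher()
    .setMaster("k8s://https://<k8s-apiserver-host>:<k8s-apiserver-port>")
    .setDeployMode("cluster")
    .setMainClass("myClass")
    .setAppResource("hdfs://server/inputs/my.jar")
    .setConf("spark.app.name", "myapp")
    .setConf("spark.executor.instances", "5")
    .setConf("spark.kubernetes.container.image", "mydockerimage")
    .startApplication(new SparkAppHandle.Listener {
      override def stateChanged(h: SparkAppHandle): Unit = {
        println(s"State Changed. App Id [${h.getAppId}] State [${h.getState}]")
        if (h.getState.isFinal) finished.countDown()
      }
      override def infoChanged(h: SparkAppHandle): Unit =
        println(s"Info Changed. App Id [${h.getAppId}] State [${h.getState}]")
    })

  // Block until the application reaches a final state (never happens for me,
  // since the callbacks are not invoked).
  finished.await()
}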