
I'm running Spark 1.6.0 on a small computing cluster and want to kill a driver program. I've submitted a custom implementation of the out-of-the-box Spark Pi calculation example with the following options:

spark-submit --class JavaSparkPi --master spark://clusterIP:portNum --deploy-mode cluster /path/to/jarfile/JavaSparkPi.jar 10

Note: 10 is a command-line argument and is irrelevant to this question.

I've tried several ways of killing the driver program that was started on the cluster:

  1. ./bin/spark-class org.apache.spark.deploy.Client kill spark://clusterIP:portNum $driverid
  2. spark-submit --master spark://node-1:6066 --kill $driverid
  3. Issue the kill command from the Spark administrative interface (web UI): http://my-cluster-url:8080
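
As I understand it, number 2 just goes through the standalone REST submission server on port 6066, so the equivalent raw request should be something along these lines (the host, port and driver ID are placeholders for my actual values):

curl -X POST http://node-1:6066/v1/submissions/kill/driver-xxxxxxxxxxxxxx-xxxx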

Number 2 yields a success JSON response:

{
  "action" : "KillSubmissionResponse",
  "message" : "Kill request for driver-xxxxxxxxxxxxxx-xxxx submitted",
  "serverSparkVersion" : "1.6.0",
  "submissionId" : "driver-xxxxxxxxxxxxxx-xxxx",
  "success" : true
}

Where 'driver-xxxxxxxxxxxxxx-xxxx' is the actual driver id.

But the web UI http://my-cluster-url:8080/ still shows the driver program as running.
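
For completeness, I believe the submission status can also be queried through the same REST port, in case the web UI is simply stale (the driver ID is again a placeholder):

spark-submit --master spark://node-1:6066 --status driver-xxxxxxxxxxxxxx-xxxx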

Is there anything else I can try?
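
I suppose that, as a last resort, I could SSH into the worker hosting the driver and kill the driver JVM by hand, roughly like this (a sketch only; as far as I know, in standalone cluster mode the driver runs inside an org.apache.spark.deploy.worker.DriverWrapper process):

jps -lm | grep org.apache.spark.deploy.worker.DriverWrapper
kill <pid-from-above>

But I'd rather understand why the kill request itself isn't taking effect.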
