I have a very long-running Spark job in which a small number of tasks are currently stalled. Is there any way to kill those stalled tasks from the driver node?
For permission reasons I can log in to the slave nodes, but I cannot kill jobs there, so I'm looking for a way to do this from the driver node alone. Note that I don't want to kill the entire Spark job - just one or two stalled tasks.
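For what it's worth, the only driver-side hooks I've found so far cancel at the job or stage level, which is too coarse for my case. A minimal Scala sketch of what I mean (the group name is just illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("long-job"))

// Tag the work so it can be cancelled as a group later.
sc.setJobGroup("my-long-job", "long running job", interruptOnCancel = true)

// Later, from another thread on the driver:
sc.cancelJobGroup("my-long-job") // kills *all* jobs in the group
// sc.cancelStage(stageId)       // kills the whole stage, not a single task
```

Neither of these lets me target just the one or two stalled tasks.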
If it helps, I'm using Mesos and have access to the web UI, but it does not offer an option to kill individual tasks.