
According to this documentation, a Spark app that was started/submitted with SparkLauncher.startApplication() can be killed through the returned SparkAppHandle and its kill() method, since it runs as a child process. I tried to implement this in combination with a CountDownLatch as a timeout, but it doesn't work for me: the Java app with the SparkLauncher finishes after 20 minutes, yet the Spark app is still running on my YARN cluster afterwards.

I'm using the following code:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

// launcher config...
final CountDownLatch countDownLatch = new CountDownLatch(1);

SparkAppHandle.Listener handleListener = new SparkAppHandle.Listener() {
    @Override
    public void stateChanged(SparkAppHandle handle) {
        // Release the latch once the application reaches a final state.
        if (handle.getState().isFinal()) {
            countDownLatch.countDown();
        }
    }

    @Override
    public void infoChanged(SparkAppHandle handle) {}
};

SparkAppHandle handle = launcher.startApplication(handleListener);
// Wait up to 20 minutes for a regular exit; otherwise kill the app.
boolean regularExit = countDownLatch.await(20, TimeUnit.MINUTES);
if (!regularExit) {
    handle.kill();
}
```

I'm also still wondering whether the kill command is even supposed to work once the app is already running on the cluster.
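In case handle.kill() only terminates the local launcher child process and never reaches the cluster, a fallback I've been considering is asking YARN to kill the application by its ID. This is just an untested sketch, assuming handle.getAppId() already returns a non-null ID at that point and the yarn CLI is on the PATH of the launching machine:

```java
// Untested fallback sketch: ask YARN to kill the application by its ID,
// in case handle.kill() only stops the local child process.
String appId = handle.getAppId(); // may be null until YARN has reported the ID
if (appId != null) {
    new ProcessBuilder("yarn", "application", "-kill", appId)
            .inheritIO()   // forward the command's output to this process
            .start()
            .waitFor();
}
```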


MUmla
  • Did you find a solution for this? @Mumla – Rohit Nimmala Jan 22 '20 at 06:31
  • 1
    Unfortunately not. I assume that this is (/was?) a bug. I also just noticed that the documentation doesn't contain the information about this kill-method anymore since version 2.3.0. – MUmla Jan 22 '20 at 12:52
