I am wondering if it is possible to submit, monitor & kill Spark applications from another service.
My requirements are as follows:
I wrote a service that
- parses user commands
- translates them into arguments understood by an already prepared Spark-SQL application
- submits the application, along with its arguments, to the Spark cluster with spark-submit invoked from ProcessBuilder (a rough sketch of this step follows the list)
- plans to run the generated applications' drivers in cluster mode
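For context, here is a minimal sketch of how such a submission could look via ProcessBuilder. The spark-submit path, main class, jar and query arguments are made-up placeholders, not my actual values:

```scala
import java.nio.charset.StandardCharsets
import scala.io.Source

// Minimal sketch of the submission step, assuming a standalone cluster;
// the spark-submit path, main class, jar and arguments are placeholders.
val cmd = java.util.Arrays.asList(
  "/opt/spark/bin/spark-submit",
  "--master", "spark://master-host:7077",
  "--deploy-mode", "cluster",
  "--class", "com.example.SparkSqlRunner",   // the prepared Spark-SQL application
  "/opt/jobs/spark-sql-runner.jar",
  "--sql", "SELECT count(*) FROM events")    // arguments translated from the user command

val pb = new ProcessBuilder(cmd)
pb.redirectErrorStream(true)                 // merge stderr into stdout
val proc = pb.start()

// In cluster mode, spark-submit's output normally includes the driver ID it was
// assigned (something like "driver-20170101120000-0000"), so capture stdout here.
val output = Source.fromInputStream(proc.getInputStream, StandardCharsets.UTF_8.name).mkString
val exitCode = proc.waitFor()
println(s"spark-submit exited with $exitCode\n$output")
```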
Other requirements are:
- query an application's status, for example the percentage of progress remaining (a rough sketch of what I mean follows this list)
- kill queries accordingly
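To illustrate the status part, a sketch like the one below is roughly what I have in mind, assuming the standalone master exposes its cluster state as JSON at http://<master-host>:8080/json (host and driver ID are placeholders). This only tells me whether a driver is still alive, not how far along the query is:

```scala
import scala.io.Source

// Rough sketch of a status check against the standalone master's JSON view of
// cluster state; the host and driver ID are placeholders. Per-job progress would
// have to come from the running application's own UI/REST endpoints instead.
val masterJsonUrl = "http://master-host:8080/json"
val driverId = "driver-20170101120000-0000"

val clusterState = Source.fromURL(masterJsonUrl).mkString

// Crude check: the driver ID appears among the master's active drivers while it is
// still running; real code would parse the JSON rather than substring-match.
val stillRunning = clusterState.contains(driverId)
println(s"$driverId still running: $stillRunning")
```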
What I found in the Spark standalone documentation suggests killing an application with:
./bin/spark-class org.apache.spark.deploy.Client kill <master url> <driver ID>
and that the driver ID should be found through the standalone Master web UI at http://<master url>:8080.
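If that is the intended route, I assume my service could drive the kill the same way it drives the submission, something like the following sketch (the Spark home, master URL and driver ID are placeholders):

```scala
// Sketch of running the documented kill command from the service via ProcessBuilder;
// the spark-class path, master URL and driver ID below are placeholders.
val killCmd = java.util.Arrays.asList(
  "/opt/spark/bin/spark-class",
  "org.apache.spark.deploy.Client", "kill",
  "spark://master-host:7077",
  "driver-20170101120000-0000")

val exit = new ProcessBuilder(killCmd)
  .inheritIO()   // let the kill client's log output flow to this service's console
  .start()
  .waitFor()
println(s"kill command exited with $exit")
```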
So, what am I supposed to do?
Related SO questions:
Spark application finished callback
Deploy Apache Spark application from another application in Java, best practice