
We have REST services running on an HTTP server and want to submit Spark code/statements concurrently to an already running Spark application.

How can we create a long-running Spark context that accepts multiple queries?

  • Hi, just to clarify. If I understand correctly, you want to do a `spark-submit` once that stays alive indefinitely, and then submit queries to it? Or do you literally want to reload the deployed JAR for an already running application? – Bartosz Konieczny Jun 28 '19 at 10:51
  • No, I am able to do spark-submit once using SparkLauncher, but what I need is something like spark-jobserver, where we can submit a piece of Spark code and get the data back. I am exploring that now. https://github.com/spark-jobserver/spark-jobserver – Mata Jul 01 '19 at 07:25
  • Thanks. I didn't use the project before so I'm wondering what's the difference with a notebook? I ask because on this https://docs.qubole.com/en/latest/admin-guide/spark-admin/spark-job-server.html#understanding-the-spark-job-server page they say that the "Qubole’s Spark Job Server is backed by Apache Zeppelin". And maybe using a notebook will be easier? – Bartosz Konieczny Jul 01 '19 at 09:43
  • You can use Apache Livy or Spark JobServer for triggering Spark jobs. I have used Apache Livy in my project. https://livy.apache.org/ You can also refer to this thread: https://stackoverflow.com/questions/28992802/triggering-spark-jobs-with-rest – dassum Jul 11 '19 at 07:03
  • Thanks @bartosz25 and @dassum! We are exploring Livy. – Mata Jul 15 '19 at 09:11
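The comments converge on Apache Livy, which exposes a long-lived Spark context over REST: `POST /sessions` starts a context once, and `POST /sessions/{id}/statements` submits code snippets to it repeatedly. A minimal sketch of that flow, assuming a Livy server at `localhost:8998` (the URL and the choice of `kind="spark"` are illustrative assumptions, not from the question):

```python
import json
import urllib.request

# Assumed Livy server address; adjust for your deployment.
LIVY_URL = "http://localhost:8998"

def livy_post(path, payload):
    """POST a JSON payload to the Livy REST API and return the parsed response."""
    req = urllib.request.Request(
        LIVY_URL + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def session_payload(kind="spark"):
    """Body for POST /sessions: starts a long-lived Spark context
    ("spark" = Scala, "pyspark" = Python)."""
    return {"kind": kind}

def statement_payload(code):
    """Body for POST /sessions/{id}/statements: runs one code snippet
    inside the existing context."""
    return {"code": code}

# Usage (requires a running Livy server):
#   session = livy_post("/sessions", session_payload())     # create the context once
#   livy_post("/sessions/%d/statements" % session["id"],
#             statement_payload("spark.range(100).count()"))  # submit snippets repeatedly
#   # Poll GET /sessions/{id}/statements/{stmtId} for the result.
```

Because the session outlives each statement, concurrent REST handlers can all target the same session id instead of paying the `spark-submit` startup cost per query.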

0 Answers