Questions tagged [spark-launcher]

34 questions
17 votes · 3 answers

Spark Launcher waiting for job completion infinitely

I am trying to submit a JAR with a Spark job to a YARN cluster from Java code. I am using SparkLauncher to submit the SparkPi example: Process spark = new SparkLauncher() …
asked by TomaszGuzialek
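A common cause of launch() appearing to hang forever is that nothing drains the child spark-submit process's stdout/stderr, so the OS pipe buffers fill up and waitFor() blocks. A minimal sketch of the launch-and-wait pattern, assuming a hypothetical jar path; the drain threads are the key detail:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import org.apache.spark.launcher.SparkLauncher;

public class LaunchAndWait {
    public static void main(String[] args) throws Exception {
        Process spark = new SparkLauncher()
                .setAppResource("/path/to/spark-examples.jar") // hypothetical path
                .setMainClass("org.apache.spark.examples.SparkPi")
                .setMaster("yarn")
                .setDeployMode("cluster")
                .launch();

        // Drain stdout and stderr in background threads; if the pipes are
        // never read, the child can stall and waitFor() never returns.
        new Thread(() -> new BufferedReader(new InputStreamReader(spark.getInputStream()))
                .lines().forEach(System.out::println)).start();
        new Thread(() -> new BufferedReader(new InputStreamReader(spark.getErrorStream()))
                .lines().forEach(System.err::println)).start();

        int exitCode = spark.waitFor();
        System.out.println("spark-submit exited with code " + exitCode);
    }
}
```

Note this only waits for the local spark-submit process; with deploy mode "cluster" the job itself runs on the cluster, so startApplication() and a SparkAppHandle give more reliable completion tracking.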
9 votes · 3 answers

Why does SparkLauncher return immediately and spawn no job?

I am using SparkLauncher in Spark v1.6.0. My problem is that when I use this class to launch my Spark jobs, it returns immediately and no job is submitted. My code is as follows: new SparkLauncher().setAppName("test word count") …
asked by Jane Wayne
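When a launch appears to do nothing, the submission error is often hidden because the child process's output is never surfaced. A sketch of a more debuggable submission using startApplication() and verbose mode, assuming hypothetical jar, class, and Spark home paths:

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class SubmitWithHandle {
    public static void main(String[] args) throws Exception {
        SparkAppHandle handle = new SparkLauncher()
                .setSparkHome("/opt/spark")               // hypothetical
                .setAppResource("/path/to/wordcount.jar") // hypothetical
                .setMainClass("com.example.WordCount")    // hypothetical
                .setMaster("yarn")
                .setAppName("test word count")
                .setVerbose(true) // logs the generated spark-submit command
                .startApplication();

        // Keep the JVM alive and poll until the app reaches a terminal state;
        // returning from main immediately can tear down the launcher before
        // the job is submitted.
        while (!handle.getState().isFinal()) {
            System.out.println("state: " + handle.getState());
            Thread.sleep(1000);
        }
        System.out.println("final state: " + handle.getState());
    }
}
```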
8 votes · 2 answers

How to properly wait for an Apache Spark Launcher job when launching it from another application?

I am trying to avoid a "while(true)" loop while waiting for my Spark job to finish, but without success. I have a Spark application which is supposed to process some data and put the result into a database; I call it from my Spring service and…
asked by Alex Aniska
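The busy-wait the question wants to avoid can be replaced by a SparkAppHandle.Listener that releases a CountDownLatch on the terminal state. A minimal sketch, assuming hypothetical jar and class names:

```java
import java.util.concurrent.CountDownLatch;
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class BlockUntilFinished {
    public static void main(String[] args) throws Exception {
        CountDownLatch done = new CountDownLatch(1);

        SparkAppHandle handle = new SparkLauncher()
                .setAppResource("/path/to/etl-job.jar") // hypothetical
                .setMainClass("com.example.EtlJob")     // hypothetical
                .setMaster("yarn")
                .startApplication(new SparkAppHandle.Listener() {
                    @Override
                    public void stateChanged(SparkAppHandle h) {
                        // FINISHED, FAILED and KILLED are all final states.
                        if (h.getState().isFinal()) {
                            done.countDown();
                        }
                    }
                    @Override
                    public void infoChanged(SparkAppHandle h) { }
                });

        done.await(); // blocks without busy-waiting
        System.out.println("job ended in state " + handle.getState());
    }
}
```

In a Spring service the await() would typically move onto an executor or be replaced with a CompletableFuture completed from the listener.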
4 votes · 0 answers

Spark launcher handle not updating state on Standalone cluster mode

I'm trying to programmatically submit Spark jobs using the Spark Launcher library in a spring web application. Everything works fine with yarn-client, yarn-cluster and standalone-client modes. However, when using standalone-cluster mode, the…
4 votes · 2 answers

Pass parameters to the jar when using spark launcher

I am trying to create an executable jar which uses Spark Launcher to run another jar with a data transformation task (that jar creates the Spark session). I need to pass Java parameters (some Java arrays) to the jar which is executed by the…
asked by Oleg Yarin
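Program arguments are forwarded with addAppArgs(); they arrive as the String[] args of the child jar's main method, so structured values such as arrays have to be encoded as strings. A sketch with hypothetical jar and class names:

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class PassArgs {
    public static void main(String[] args) throws Exception {
        // Complex values (e.g. Java arrays) must be flattened to strings:
        // comma-separated, JSON, or a temp-file path the child reads back.
        SparkAppHandle handle = new SparkLauncher()
                .setAppResource("/path/to/transform.jar") // hypothetical
                .setMainClass("com.example.Transform")    // hypothetical
                .setMaster("local[*]")
                .addAppArgs("--ids", "1,2,3", "--mode", "full")
                .startApplication();
    }
}
```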
4 votes · 0 answers

Can't kill Spark app with SparkLauncher and SparkAppHandle

According to this documentation, a Spark app which was started/submitted with SparkLauncher and the startApplication method can be killed with the returned SparkAppHandle and its kill() method, since it is a child process. I tried to implement…
asked by MUmla
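For reference, a sketch of the stop/kill pattern the question describes, with hypothetical jar and class names. One caveat worth checking when kill() seems to do nothing: the handle can only act on the child while the launcher is still connected to it (i.e. before the state becomes LOST):

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class KillExample {
    public static void main(String[] args) throws Exception {
        SparkAppHandle handle = new SparkLauncher()
                .setAppResource("/path/to/long-job.jar") // hypothetical
                .setMainClass("com.example.LongJob")     // hypothetical
                .setMaster("local[*]")
                .startApplication();

        Thread.sleep(30_000); // let the job run for a while

        if (!handle.getState().isFinal()) {
            // stop() asks the application to exit gracefully;
            // kill() terminates the child process if stop() has no effect.
            handle.stop();
            handle.kill();
        }
        System.out.println("state after kill: " + handle.getState());
    }
}
```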
4 votes · 0 answers

Receiving result from spark job launched using SparkLauncher

I am launching a Spark job using the following code: public static void main(String[] args) throws InterruptedException, ExecutionException { Process sparkProcess; try { sparkProcess = new SparkLauncher() …
asked by vatsal mevada
4 votes · 0 answers

Web application failed to launch Spark job using SparkLauncher

I want to launch a Spark job from a Java EE web application; SparkLauncher starts without an exception, but no job appears on the Spark cluster. Can anybody help? public static void runJob(String userId) throws Exception { long previous =…
asked by victorming888
3 votes · 0 answers

SparkAppHandle states not getting updated in Kubernetes

While launching a Spark application through SparkLauncher, the SparkAppHandle state is not getting updated. sparkLaunch = new…
3 votes · 0 answers

SparkLauncher launching only one application even when submitting multiple applications

I am submitting/running multiple applications through Spark Launcher in my Java web app, but it seems to submit only one app. Here is my code: Runnable run = new Runnable() { public void run() { try { SparkAppHandle…
asked by AngryLeo
3 votes · 1 answer

Difference between running a Spark application standalone vs via spark-submit / Spark Launcher?

I am exploring different options to package a Spark application and I am confused about which is the best mode and what the differences are between the following modes: submitting the Spark application's jar to spark-submit; constructing a fat jar out of the Spark Gradle…
asked by Mozhi
3 votes · 1 answer

Spark launcher handle and listener not giving state

I have a web application which submits Spark jobs to a Cloudera Spark cluster using the Spark Launcher library. It successfully submits the Spark job to the cluster. However, it does not call back the listener class methods, and the getState()…
asked by Reddy
3 votes · 1 answer

How to set driver java options in SparkLauncher

When using spark-submit to submit a Spark app to YARN, I can pass Java options to the driver via --driver-java-options, for example: spark-submit --driver-java-options "-Dlog4j.configuration=file:///conf/log4j.properties" ... How do I achieve…
asked by mitchus
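spark-submit's --driver-java-options maps to the configuration key spark.driver.extraJavaOptions, which SparkLauncher exposes as a constant for setConf(). A sketch with hypothetical jar and class names:

```java
import org.apache.spark.launcher.SparkLauncher;

public class DriverOpts {
    public static void main(String[] args) throws Exception {
        Process spark = new SparkLauncher()
                .setAppResource("/path/to/app.jar") // hypothetical
                .setMainClass("com.example.App")    // hypothetical
                .setMaster("yarn")
                // Equivalent to --driver-java-options on the command line.
                .setConf(SparkLauncher.DRIVER_EXTRA_JAVA_OPTIONS,
                        "-Dlog4j.configuration=file:///conf/log4j.properties")
                .launch();
        spark.waitFor();
    }
}
```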
2 votes · 1 answer

SparkAppHandle gives unknown state forever

I am launching a Spark job from a Java application using SparkLauncher. SparkAppHandle jobHandle; try { jobHandle = new SparkLauncher() .setSparkHome("C:\\spark-2.0.0-bin-hadoop2.7") …
asked by vatsal mevada
1 vote · 1 answer

What is --archives for SparkLauncher in Java?

I am going to submit a PySpark task, and submit an environment with the task. I need --archives to submit the zip package containing the full environment. The working spark-submit command is: /my/spark/home/spark-submit --master yarn --deploy-mode…
asked by Haha TTpro
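SparkLauncher has no dedicated setter for --archives, but any spark-submit flag can be forwarded with addSparkArg(name, value). A sketch for shipping a zipped Python environment, with hypothetical script and archive names:

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class SubmitWithArchives {
    public static void main(String[] args) throws Exception {
        SparkAppHandle handle = new SparkLauncher()
                .setAppResource("/path/to/task.py") // hypothetical PySpark script
                .setMaster("yarn")
                .setDeployMode("cluster")
                // Same as: spark-submit --archives environment.zip#environment
                .addSparkArg("--archives", "environment.zip#environment") // hypothetical zip
                // Point the YARN application master at the shipped interpreter.
                .setConf("spark.yarn.appMasterEnv.PYSPARK_PYTHON",
                        "environment/bin/python")
                .startApplication();
    }
}
```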