
I start:

  • master using spark-class org.apache.spark.deploy.master.Master
  • worker using spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT

And then I submit the jar using

spark-submit \
    --class pack1.Maintest \
    --master spark://IP:PORT \
    --deploy-mode cluster \
    Program_1.jar

PROBLEM:

Using these commands I can run a single application, but when I try to run another application like so:

spark-submit \
    --class pack0.test2 \
    --master spark://IP:PORT \
    --deploy-mode cluster \
    Program_2.jar

and assign another worker to this jar with spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT, the web UI shows this second jar as waiting.

I tried using the --detach flag:

spark-submit \
    --class <class-name-1> \
    --master <master-url> \
    --deploy-mode cluster \
    --name job-1 <jar-name-1> \
    --detach

or like this:

spark-submit \
    --class my.package.Main \
    --master yarn \
    --deploy-mode cluster \
    --driver-memory 2g \
    --executor-memory 4g \
    --num-executors 10 \
    my-application.jar arg1 arg2

I just want to run 2 or 3 jar files at the same time, without one waiting for the other, in cluster mode using spark-submit.

  • Does this answer your question? [How do I run multiple spark applications in parallel in standalone master](https://stackoverflow.com/questions/43516948/how-do-i-run-multiple-spark-applications-in-parallel-in-standalone-master) – Koedlt Apr 02 '23 at 16:37
  • When a Spark app is waiting, it's usually because there are no free resources: the first app is already consuming all the memory/CPU. Try defining the amount of memory/CPU for each spark-submit so that resources are left for the other apps. – Abdennacer Lachiheb Apr 02 '23 at 20:38
  • 1
    thank you @koedlt for sharing this link. And yes it does answer my question. – zainab Apr 14 '23 at 04:47
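Following the suggestion in the comments, here is a sketch of how the two submissions could cap their own resources so the standalone scheduler can run them side by side. The memory and core values are illustrative placeholders (they have not been tuned against a real cluster), but `--driver-memory`, `--executor-memory`, and `--total-executor-cores` are standard spark-submit options for standalone mode:

```shell
#!/bin/sh
# Cap each application's footprint so a single submission cannot
# grab every core and all the memory on the standalone cluster.
# IP:PORT and the resource values below are placeholders.

spark-submit \
    --class pack1.Maintest \
    --master spark://IP:PORT \
    --deploy-mode cluster \
    --driver-memory 1g \
    --executor-memory 2g \
    --total-executor-cores 2 \
    Program_1.jar &

spark-submit \
    --class pack0.test2 \
    --master spark://IP:PORT \
    --deploy-mode cluster \
    --driver-memory 1g \
    --executor-memory 2g \
    --total-executor-cores 2 \
    Program_2.jar &

# Wait for both background submissions to return.
wait
```

Without `--total-executor-cores` (or `spark.cores.max`), a standalone-mode application claims all available cores by default, which is why the second jar sits in the WAITING state.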

0 Answers