I start:
- the master, using
spark-class org.apache.spark.deploy.master.Master
- a worker, using
spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT
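(For reference: by default a worker registers all of the machine's cores and all memory minus 1 GB with the master. A worker can also be started with explicit limits; the 2 cores / 4g values below are just placeholders from my setup:)

spark-class org.apache.spark.deploy.worker.Worker --cores 2 --memory 4g spark://IP:PORT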
And then I submit the jar using
spark-submit \
--class pack1.Maintest \
--master spark://IP:PORT \
--deploy-mode cluster \
Program_1.jar
PROBLEM:
Using these commands I can run a single application, but when I try to run another application like so:
spark-submit \
--class pack0.test2 \
--master spark://IP:PORT \
--deploy-mode cluster \
Program_2.jar
and assign another worker to this jar with
spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT
the master's web UI shows this 2nd application as WAITING.
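My understanding is that, by default, the first application grabs every available core in the cluster, so the second one has nothing left to run on. I suppose each submission would need a cap, something like this (the --total-executor-cores and --executor-memory values are just guesses for illustration):

spark-submit \
--class pack1.Maintest \
--master spark://IP:PORT \
--deploy-mode cluster \
--total-executor-cores 2 \
--executor-memory 1g \
Program_1.jar

In cluster mode the driver itself also runs on a worker and takes a core and some memory, so the cluster would need headroom for both drivers as well.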
I also tried this with a detach option:
spark-submit \
--class <class-name-1> \
--master <master-url> \
--deploy-mode cluster \
--name job-1 \
--detach \
<jar-name-1>
or like this:
spark-submit \
--class my.package.Main \
--master yarn \
--deploy-mode cluster \
--driver-memory 2g \
--executor-memory 4g \
--num-executors 10 \
my-application.jar arg1 arg2
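(That last variant targets YARN rather than my standalone master; if I went that route, I assume concurrency would be governed by the YARN scheduler's queue capacity instead of standalone core limits, e.g. with the --queue option, where "default" is just the stock queue name:)

spark-submit \
--class my.package.Main \
--master yarn \
--deploy-mode cluster \
--queue default \
my-application.jar arg1 arg2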
I just want to run 2-3 jar files concurrently, without the others waiting, in cluster mode using spark-submit.
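What I am aiming for looks roughly like this; since spark-submit in cluster mode returns as soon as the driver is launched, I expect to be able to fire the submissions back to back (the core caps are placeholders so that both drivers and their executors fit on the cluster at once):

spark-submit --master spark://IP:PORT --deploy-mode cluster \
--class pack1.Maintest --total-executor-cores 2 Program_1.jar

spark-submit --master spark://IP:PORT --deploy-mode cluster \
--class pack0.test2 --total-executor-cores 2 Program_2.jar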