
I am running a Spark job, and even though I've set the --num-executors parameter to 3, I can't see any executors in the web UI's Executors tab. Why is this happening?

  • I've used the command:

    bin\spark-submit.cmd ^
      --master local[8] ^
      --num-executors 3 ^
      --driver-memory 2G ^
      --executor-memory 2G ^
      --total-executor-cores 2 ^
      --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.0.1 ^
      --py-files C:\xyz.py ^
      C:\abc.py

    – Ayushi Dewangan Sep 17 '21 at 12:36

1 Answer


Spark in local mode is non-distributed: the entire application runs in a single JVM, and the driver itself also acts as the executor.

The only thing you can configure is the number of threads, via the master URL (local[8] means 8 threads); executor-related flags such as --num-executors have no effect in local mode.
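For illustration, a minimal local-mode submit reusing the path from the question would look like this (a sketch with the ignored executor flags removed; --num-executors only applies under resource managers such as YARN, and --total-executor-cores only applies to standalone/Mesos clusters; local[*] would use one thread per available core instead of a fixed 8):

bin\spark-submit.cmd --master local[8] --driver-memory 2G C:\abc.py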

You can switch to standalone mode instead. Start a master using the command below:

spark-class org.apache.spark.deploy.master.Master
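(Since the question uses the Windows launcher scripts, the equivalent there is presumably bin\spark-class.cmd with the same class name. On startup, the master logs its spark://<host>:7077 URL and serves a web UI on port 8080 by default.)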

And then start a worker, pointing it at that master URL:

spark-class org.apache.spark.deploy.worker.Worker spark://<host>:7077
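You can also cap the resources the worker advertises; for example, to offer exactly 6 cores and 12 GB of memory (a sketch using the --cores and --memory flags from the standalone documentation; <host> stays a placeholder):

spark-class org.apache.spark.deploy.worker.Worker --cores 6 --memory 12g spark://<host>:7077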

Now run the spark-submit command against that master. Standalone mode gives an application all available cores by default, so if your worker has 6 cores, specifying just --executor-cores 2 will create 3 executors, and you can check them on the Spark UI's Executors tab, as shown in the sketch below.
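Putting it together, the command from the question adapted for standalone mode might look like this (a sketch: localhost assumes the master runs on the same machine; --num-executors is dropped because it is YARN-only, and --total-executor-cores is omitted so the application can take all 6 cores):

bin\spark-submit.cmd ^
  --master spark://localhost:7077 ^
  --executor-cores 2 ^
  --executor-memory 2G ^
  --driver-memory 2G ^
  --packages org.apache.spark:spark-sql-kafka-0-10_2.12:3.0.1 ^
  --py-files C:\xyz.py ^
  C:\abc.py

With 6 worker cores split into 2-core executors, the master starts 3 executors for the job.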

Mohana B C