I launched my Spark program in client mode and set spark.executor.memory to 24g while creating the SparkSession. However, when I check the Spark UI, I see spark.executor.memory reported as 2g. Why is the value not being applied during SparkSession creation?
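For reference, the configuration was set roughly like this (a minimal sketch; the app name is a placeholder, and the original post does not say whether Scala or Python was used):

```python
from pyspark.sql import SparkSession

# Setting executor memory programmatically while building the session.
# Per the discussion below, in YARN client mode this value may not take
# effect if the driver JVM has already started with the default (2g);
# passing it on the spark-submit command line resolved the issue.
spark = (
    SparkSession.builder
    .appName("example-app")  # placeholder, not from the original post
    .master("yarn")
    .config("spark.executor.memory", "24g")
    .getOrCreate()
)
```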
- In which mode are you running? YARN or standalone? – lvnt Jul 29 '18 at 08:51
- Hi Don, can you paste the spark-submit command here? Are you running standalone, YARN, or Mesos deployments? Do you have enough resources on your cluster? Which job are you running, stream or batch, using checkpoints? – Arnon Rodman Jul 29 '18 at 08:52
- Hi Arnon, here is my command: `spark-submit --class --deploy-mode client --driver-memory 24g` – Don Sam Jul 29 '18 at 17:51
- However, I later figured out that in client mode even executor memory must be passed at run time with spark-submit. I am running it with master as YARN. Thanks. – Don Sam
- Some of the configuration parameters in the yarn-site.xml may cause this. You can find some detail about this situation in this answer: https://stackoverflow.com/questions/51511019/why-does-not-spark-executor-instances-work/51511340#51511340 – lvnt Jul 29 '18 at 20:46
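As an illustration of the kind of yarn-site.xml setting that can interfere with executor sizing (an assumed example, not taken from the linked answer):

```xml
<!-- Hypothetical yarn-site.xml fragment: if the per-container maximum
     is below the requested executor memory plus overhead, YARN cannot
     grant containers of the requested size. -->
<property>
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>28672</value> <!-- must exceed 24g plus executor memory overhead -->
</property>
```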