
I have the following settings in my Spark job:

--num-executors 2 \
--executor-cores 1 \
--executor-memory 12G \
--driver-memory 16G \
--conf spark.streaming.dynamicAllocation.enabled=false \
--conf spark.dynamicAllocation.enabled=false \
--conf spark.streaming.receiver.writeAheadLog.enable=false \
--conf spark.executor.memoryOverhead=8192 \
--conf spark.driver.memoryOverhead=8192
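
For context, if this runs on YARN, each executor container should be requested at roughly executor-memory + memoryOverhead = 12G + 8G = 20G, and the driver at 16G + 8G = 24G, so the resource requests themselves look straightforward to me.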

My understanding is that the job should run with 2 executors; however, it is running with 3. This is happening to several of my jobs. Could someone please explain the reason?
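
For what it's worth, here is a minimal sketch of how one might list what Spark reports as executors from inside the job (the object and app names are just placeholders). As far as I can tell, `sc.statusTracker.getExecutorInfos` is backed by the same data as the Executors tab of the UI, and the driver itself appears as one entry in that list:

import org.apache.spark.sql.SparkSession

// Minimal sketch for listing what Spark reports as executors at runtime.
object ExecutorCountCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("executor-count-check")
      .getOrCreate()

    // getExecutorInfos returns all known executors; note that the driver
    // itself also seems to show up as one entry in this list, which can
    // make the count read one higher than --num-executors.
    val executors = spark.sparkContext.statusTracker.getExecutorInfos
    executors.foreach(info => println(s"${info.host()}:${info.port()}"))
    println(s"total entries: ${executors.length}")

    spark.stop()
  }
}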
