from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

spark.sparkContext.getConf().get('spark.executor.instances') 
# Result: None

spark.conf.get('spark.executor.instances') 
# Result: java.util.NoSuchElementException: spark.executor.instances

I would like to see the default value for the number of executors.

I looked into several ways to get the value, and none of them worked.

How do I get the number of workers (executors) in PySpark?

  • Have you tried `sc._jsc.sc().getExecutorMemoryStatus()`? Just make sure not to call that at the beginning of the file (or at least put a sleep command before it if you do) – vilalabinot Dec 21 '22 at 09:20
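
A minimal sketch of the approach suggested in the comment above, assuming the executors have already registered with the driver (the status map includes the driver's own block manager, so one entry is subtracted):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# getExecutorMemoryStatus() reports one entry per block manager,
# which includes the driver, so subtract 1 to count only executors.
num_executors = spark.sparkContext._jsc.sc().getExecutorMemoryStatus().size() - 1
print(num_executors)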

1 Answer


You can see the specified configurations in the Environment tab of the application web UI, or get all explicitly specified parameters with the following line:

spark.sparkContext.getConf().getAll() 
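
For example, a small sketch that filters the result of getAll() down to the executor-related settings (only keys that were explicitly set will appear):

# getAll() returns a list of (key, value) pairs for explicitly set options;
# narrow it to the executor-related ones.
for key, value in spark.sparkContext.getConf().getAll():
    if key.startswith("spark.executor"):
        print(key, value)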

According to the Spark documentation:

only values explicitly specified through spark-defaults.conf, SparkConf, or the command line will appear. For all other configuration properties, you can assume the default value is used.

So if you did not assign a value to spark.executor.instances, you should check its default value on the Running Spark on YARN documentation page.
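
As a related convenience, spark.conf.get accepts a fallback value as a second argument, which avoids the NoSuchElementException shown in the question (a sketch assuming the session from the question):

# Passing a default avoids java.util.NoSuchElementException when the
# key was never explicitly set.
spark.conf.get("spark.executor.instances", "not set")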
