I have created a Spark cluster of 8 machines. Each machine has 104 GB of RAM and 16 virtual cores.

It seems that Spark only sees 42 GB of RAM per machine, which is not correct. Do you know why Spark does not see all the RAM of the machines?

[Screenshot: Spark web UI showing 42 GB of memory per worker]

PS: I am using Apache Spark 1.2.

poiuytrez

2 Answers


This seems to be a common misconception. What the UI displays is not the total RAM but the portion of executor memory reserved for storage, governed by spark.storage.memoryFraction: https://stackoverflow.com/a/28363743/4278362
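A back-of-the-envelope sketch of that arithmetic for Spark 1.x. The 78 GB executor size below is a hypothetical value chosen to illustrate how a ~42 GB figure could appear in the UI; 0.6 and 0.9 are the Spark 1.2 defaults for spark.storage.memoryFraction and spark.storage.safetyFraction:

```scala
// Rough sketch of how Spark 1.x sizes the storage region shown in the UI.
val executorMemoryGb = 78.0 // hypothetical spark.executor.memory value
val memoryFraction   = 0.6  // spark.storage.memoryFraction (Spark 1.2 default)
val safetyFraction   = 0.9  // spark.storage.safetyFraction (Spark 1.2 default)

// The UI reports only the memory reserved for cached RDD blocks:
val storageMemoryGb = executorMemoryGb * memoryFraction * safetyFraction
println(f"Storage memory shown in UI: $storageMemoryGb%.1f GB") // ~42.1 GB
```

So the remaining RAM is not invisible to Spark; it is simply not counted in the storage figure the UI reports.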

Sietse

Spark makes no attempt to guess the available memory. Executors use exactly as much memory as you specify with the spark.executor.memory setting; it looks like yours is set to 42 GB.
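A minimal sketch of setting that value explicitly when constructing the context; the "90g" figure is illustrative, not a recommendation:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Size executors explicitly instead of relying on defaults.
// "90g" is an illustrative value; leave headroom for the OS and daemons.
val conf = new SparkConf()
  .setAppName("MemorySizingExample")
  .set("spark.executor.memory", "90g")

val sc = new SparkContext(conf)
```

The same setting can be passed on the command line with spark-submit --executor-memory 90g, or set in spark-defaults.conf.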

Daniel Darabos