
I am aware of Spark's memory management model (Reserved, User, Storage, and Execution memory), and I also know how to control the sizes of these regions (a minimal configuration sketch is below). I have a very basic question, specifically about Spark execution memory: why does a Spark job fail with an out-of-memory error if it can spill both execution and storage memory to disk?
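For context, this is a minimal sketch (not part of the original post) of the unified-memory settings I am referring to; the property names are standard Spark configuration keys and the values shown are just the documented defaults:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("memory-config-sketch")
  // Fraction of (executor heap - 300MB reserved) shared by execution + storage.
  .config("spark.memory.fraction", "0.6")
  // Portion of that unified region protected for storage (cached blocks).
  .config("spark.memory.storageFraction", "0.5")
  // Executor heap size; the remainder outside the unified region is "user" memory.
  .config("spark.executor.memory", "4g")
  .getOrCreate()
```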

Please don't take this the wrong way, but be precise in your answer; I need an in-depth understanding.

  • The post https://stackoverflow.com/questions/55605506/if-spark-support-memory-spill-to-disk-how-can-spark-out-of-memory-happen does not answer my question. – Abhijeet Sachdev Dec 30 '22 at 03:50

0 Answers