This diagram is quite clear about the relationship between the different YARN and Spark memory-related settings, except when it comes to `spark.python.worker.memory`.
How does `spark.python.worker.memory` fit into this memory model? Are the Python processes governed by `spark.executor.memory` or by `yarn.nodemanager.resource.memory-mb`?
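
For context, here is a minimal sketch of where each setting is supplied (the values are placeholders, not recommendations):

```python
from pyspark import SparkConf, SparkContext

conf = (
    SparkConf()
    .setAppName("memory-settings-example")
    # JVM heap per executor.
    .set("spark.executor.memory", "4g")
    # Per-Python-worker memory; it is unclear (hence this question) whether
    # this is a hard limit or only a spill threshold, and which of the
    # budgets above it counts against.
    .set("spark.python.worker.memory", "512m")
)
sc = SparkContext(conf=conf)

# yarn.nodemanager.resource.memory-mb is not a Spark setting; it is
# configured in yarn-site.xml on each NodeManager, e.g.:
#   <property>
#     <name>yarn.nodemanager.resource.memory-mb</name>
#     <value>16384</value>
#   </property>
```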
Update
This question explains what the setting does, but it doesn't answer the memory-governance part of my question, i.e., how the setting relates to the other memory settings above.