According to [Spark on YARN resource manager: Relation between YARN Containers and Spark Executors], the number of YARN containers should be equal to the num-executors for a Spark application. However, in one run I saw that num-executors shown in the Spark UI environment tab was 60, while the number of containers shown in YARN was only 37. I was using Spark 2.2 with spark.dynamicAllocation.enabled set to false, on an Azure HDInsight cluster. Can anyone explain this?
Viewed 257 times
It also depends on your resources (CPU cores and memory). – howie Apr 03 '19 at 01:35
1 Answer
The Spark UI also shows some terminated executors. They may have been removed by Spark dynamic allocation or through YARN preemption. You can normally tell from the UI whether executors are still alive or not.
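To see this effect, you can compare the total executor count against the active one. Spark's monitoring REST API exposes an `allexecutors` endpoint that lists both live and dead executors, each with an `isActive` flag; only the active ones still hold a YARN container. A minimal sketch (the sample response below is hypothetical, shaped like the real JSON):

```python
# Hypothetical sample of the JSON returned by
# /api/v1/applications/<app-id>/allexecutors (fields trimmed).
sample_executors = [
    {"id": "driver", "isActive": True},
    {"id": "1", "isActive": True},
    {"id": "2", "isActive": False},  # e.g. lost to YARN preemption
    {"id": "3", "isActive": True},
]

def count_executors(executors):
    """Return (total_listed, still_active) executor counts."""
    total = len(executors)
    active = sum(1 for e in executors if e["isActive"])
    return total, active

total, active = count_executors(sample_executors)
print(f"listed in UI: {total}, still holding a YARN container: {active}")
```

If the two numbers differ the way yours do (60 listed vs. 37 in YARN), the gap is made up of executors that have already been torn down.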
Another reason for the numbers to differ is the Spark driver. In 'yarn-cluster' mode the driver occupies a YARN container too, so you'll see a +1 container difference in that case as well.
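The deploy-mode difference can be sketched with the submit command itself (the jar name here is hypothetical):

```shell
# cluster mode: the driver runs inside a YARN container, so YARN
# reports one container more than --num-executors.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 60 \
  --conf spark.dynamicAllocation.enabled=false \
  my_app.jar

# client mode: the driver runs on the submitting host, outside YARN;
# only the executors plus the small ApplicationMaster container appear in YARN.
```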

Tagar