
I have an EMR cluster that has correctly spawned 6 executors with 4 cores each. When the Spark job is run on the cluster, it creates 6 containers, each of which is assigned only 1 core. How do I specify the number of cores each container is allocated?

Relevant config:

spark.executor.instances: 5
spark.executor.cores: 4
yarn.scheduler.minimum.allocation.vcores: 4
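For reference, the same sizing can also be passed explicitly at submit time rather than via config files; a minimal sketch (the jar name and deploy mode are placeholders, not taken from the original post):

```shell
# Request 5 executors with 4 cores each at submission time.
# --executor-cores maps to spark.executor.cores, and
# --num-executors maps to spark.executor.instances.
spark-submit \
  --deploy-mode cluster \
  --num-executors 5 \
  --executor-cores 4 \
  your-app.jar
```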

1 Answer


This seems to be simply a reporting error in the YARN UI, as discussed in Dataproc set number of vcores per executor container.

Spark UI shows the correct number of cores in the executors, and the log files from the application state that it has been assigned the correct number of cores.
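If you want to confirm the real allocation without clicking through the UI, Spark's monitoring REST API exposes the same executor data; a sketch (the driver host, port, and application id are placeholders):

```shell
# Query the running application's executors endpoint.
# Each executor entry includes a "totalCores" field, which reflects the
# actual allocation rather than the YARN UI's per-container vcore display.
curl http://<driver-host>:4040/api/v1/applications/<app-id>/executors
```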
