I am getting the following error in Spark 1.5:
Diagnostics: Container [pid=19554,containerID=container_e94_1518800506024_42837_02_000017] is running beyond physical memory limits. Current usage: 3.5 GB of 3.5 GB physical memory used; 4.3 GB of 7.3 GB virtual memory used. Killing container. Dump of the process-tree for container_e94_1518800506024_42837_02_000017
MASTER_URL=yarn-cluster
NUM_EXECUTORS=10
EXECUTOR_MEMORY=4G
EXECUTOR_CORES=6
DRIVER_MEMORY=3G
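For context, the job is submitted roughly as below (a sketch; the `spark.yarn.executor.memoryOverhead` line is something I have seen suggested for this kind of error, not something I currently set — in Spark 1.x the overhead defaults to max(384 MB, 10% of executor memory), which is how a 4G request can become a larger YARN container):

```shell
# Sketch of the submit command using the settings above.
# --conf spark.yarn.executor.memoryOverhead is an illustrative addition,
# not part of my current configuration.
spark-submit \
  --master yarn-cluster \
  --num-executors 10 \
  --executor-memory 4G \
  --executor-cores 6 \
  --driver-memory 3G \
  --conf spark.yarn.executor.memoryOverhead=512 \
  --class com.example.MyApp \   # placeholder class name
  my-app.jar                    # placeholder jar
```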
The application reads only about 7 MB of Avro data, but it performs multiple writes.
Is there any problem with the job configuration?