I run a script on Spark/Scala to collect calculated results. The result data set is not too large; however, when I run the following:
Result.collect()
I get the following error:
#java.lang.OutOfMemoryError: Java heap space
# -XX:OnOutOfMemoryError="kill -9 %p"
# Executing /bin/sh -c "kill -9 10466"...
/usr/lib/spark/bin/spark-shell: line 41: 10466 Killed " $FWDIR"/bin/spark-submit --class org.apache.spark.repl.Main --name "Spark shell" "$@"
I am not sure why I get this error. To increase memory, I launched the shell with the following command:
spark-shell --driver-memory 8G --executor-memory 8G --executor-cores 4 --num-executors 5
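Since `collect()` materializes every partition on the driver JVM, it is the driver heap, not executor memory, that overflows here. A sketch of the kind of invocation that raises the driver heap (the sizes are illustrative, not recommendations):

```shell
# collect() gathers all partitions onto the driver, so --driver-memory is the
# setting that matters for this OOM; executor memory does not help here.
spark-shell --driver-memory 8G \
  --executor-memory 8G \
  --executor-cores 4 \
  --num-executors 5
```

If the full dataset does not actually need to live on the driver, `Result.take(n)` for a sample, or writing out with `Result.saveAsTextFile(...)`, keeps the data distributed and avoids the driver-side copy entirely.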
Could you please help with this?
Thanks, Amir