How do I configure my JVM for Spark on Yarn?
I want to increase the maximum and minimum heap space using -Xmx and -Xms. However, I do not know how to pass these options, or which of the running Java processes to apply them to, because Spark appears to run several JVMs at once. See the image for more information.
First, I ssh to my cluster. Second, I start an IPython notebook. Third, I start Spark.
Starting Spark Code
ssh -i ~/.ssh/huddle-hadoop hadoop@ec2-54-83-79-162.compute-1.amazonaws.com
export IPYTHON_OPTS="notebook"
~/spark/bin/pyspark --master yarn-client --num-executors 200 --executor-memory 4g
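For context, on YARN the heap of each JVM is normally sized through Spark's own memory options rather than by passing -Xmx/-Xms directly (Spark derives the -Xmx it gives each JVM from these settings). A sketch of what that could look like with this launch command follows; the memory values are illustrative, not recommendations:

```shell
# Illustrative: --driver-memory sizes the driver JVM's heap,
# --executor-memory sizes each executor JVM's heap.
~/spark/bin/pyspark --master yarn-client \
    --num-executors 200 \
    --driver-memory 8g \
    --executor-memory 4g \
    --conf spark.executor.extraJavaOptions="-XX:+UseConcMarkSweepGC"
```

Note that spark.executor.extraJavaOptions (and its driver counterpart) is meant for non-heap JVM flags such as GC settings; Spark rejects -Xmx there, so heap size has to go through the memory options above.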
Details
java version "1.7.0_71"
Java(TM) SE Runtime Environment (build 1.7.0_71-b14)
Java HotSpot(TM) 64-Bit Server VM (build 24.71-b01, mixed mode)