I am running Spark on my local Windows machine and can start the Spark shell successfully.
I want to edit the spark-env.sh file in the conf/ folder. What is the right way to add values to spark-env.sh?
For example, if I want to set the SPARK_EXECUTOR_MEMORY variable, how should I do it? I am getting confused between the different answers that are available:
1. SPARK_EXECUTOR_MEMORY="2G"
2. export SPARK_EXECUTOR_MEMORY="2G"
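For context, this is roughly what I am trying to put in the file (a minimal sketch only, following the export style that the bundled conf/spark-env.sh.template seems to suggest; "2g" is just an example value):

    #!/usr/bin/env bash
    # conf/spark-env.sh -- minimal sketch, export style as in spark-env.sh.template
    # Memory to allocate per executor (example value)
    export SPARK_EXECUTOR_MEMORY="2g"

I am also not sure whether this file is even picked up on Windows, or whether the setting belongs somewhere else, so please correct me if this sketch is the wrong approach.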