
I am running Spark on my local Windows machine. I am able to start the Spark shell successfully.

I want to edit the spark-env.sh file residing in the conf/ folder. What is the right way to add values to the spark-env.sh file?

E.g. if I want to add a value for the SPARK_EXECUTOR_MEMORY variable, how do I do it? I am getting confused between the different answers that are available: 1. `SPARK_EXECUTOR_MEMORY="2G"` 2. `export SPARK_EXECUTOR_MEMORY=2G`

Anbarasu

2 Answers


The spark-env.sh is a regular bash script intended for Unix, so on a Windows installation it will never get picked up.

On Windows, you'll need a spark-env.cmd file in the conf directory and use the following syntax instead:

```
set SPARK_EXECUTOR_MEMORY=2G
```

On Unix, the file is called spark-env.sh and you will need to prepend each of your properties with export (e.g. `export SPARK_EXECUTOR_MEMORY=2G`).
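For reference, a minimal spark-env.sh on Unix might look like the sketch below. The memory values are only examples, not recommended defaults; adjust them to your machine:

```shell
#!/usr/bin/env bash
# Minimal spark-env.sh sketch (Unix). The values below are example
# settings chosen for illustration, not Spark defaults.
export SPARK_EXECUTOR_MEMORY=2G   # memory allotted to each executor
export SPARK_DRIVER_MEMORY=1G     # memory allotted to the driver process
```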

Jonathan Taws
    Thank you! Creating spark-env.cmd in conf directory and setting the values as `set SPARK_EXECUTOR_MEMORY=2G` worked. – Anbarasu Jul 11 '16 at 08:47

You must use export to add any configuration in a *.sh file. So in the spark-env.sh file, use the following example:

```
export SPARK_MASTER_IP=192.165.5.1
export SPARK_EXECUTOR_MEMORY=2g
# or: export SPARK_EXECUTOR_MEMORY=2G
```

There is no need to put double quotes around the values.
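A quick way to convince yourself the quotes are optional: in bash, `VAR=2G` and `VAR="2G"` assign exactly the same string, since quotes only matter for values containing spaces or special characters:

```shell
#!/usr/bin/env bash
# Demonstrates that quoting a simple value changes nothing.
A=2G
B="2G"
[ "$A" = "$B" ] && echo same   # prints: same
```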

Sheel