
How do you configure an executor's memory in a Spark cluster? Also, how do you configure the number of executors per worker node?

Is there any way to know how much executor memory is free to cache or persist new RDDs?

1 Answer

To configure Spark executor memory, set the spark.executor.memory parameter or pass the --executor-memory flag when submitting the job.
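
For example, a minimal spark-submit sketch (the 4g value, class name, and JAR path are placeholders):

    # the two forms below are equivalent; 4g is an example value
    spark-submit --executor-memory 4g --class com.example.MyApp myapp.jar
    spark-submit --conf spark.executor.memory=4g --class com.example.MyApp myapp.jar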

The number of executors per node depends on which cluster manager you use for Spark. With YARN and Mesos you don't have direct control over this; you can only set the number of executors. On a Spark Standalone cluster, you can tune the SPARK_WORKER_INSTANCES parameter, as sketched below.
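
A standalone-mode sketch of conf/spark-env.sh on each worker host (the specific values are illustrative):

    # run 2 worker instances per node, each offering 4 cores and 8g of memory
    export SPARK_WORKER_INSTANCES=2
    export SPARK_WORKER_CORES=4
    export SPARK_WORKER_MEMORY=8g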

You can check the amount of free memory in the Spark driver's web UI. See How to set Apache Spark Executor memory for why this is not equal to the total executor memory you've set.
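
Assuming default settings, the driver UI is served at http://<driver-host>:4040, and its Executors tab shows each executor's used and total storage memory.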

– 0x0FFF
  • Does this mean that in standalone mode you cannot have more than one executor per worker, so you have to create multiple worker instances? – vefthym Mar 08 '17 at 10:27