
I have set spark.executor.memory to 2048m, and on the UI "Environment" page I can see that the value has been set correctly. But on the "Executors" page, I see only one executor, and its memory is 265.4MB. That is a very strange value. Why not 256MB, or simply what I set?
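
(For reference, a minimal sketch of one common way this setting is applied; the question doesn't say exactly how it was set, and the jar and class names below are placeholders:)

```
# Hypothetical invocation; the application jar and class are placeholders.
spark-submit \
  --conf spark.executor.memory=2048m \
  --class com.example.MyApp \
  my-app.jar
```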

What am I missing here?

– David S.

2 Answers


The "Executors" tab on the UI also includes the driver in the list. Its "executor ID" is listed as <driver>. This process is not started by Spark, so it is not affected by spark.executor.memory.

  • If you start the driver with spark-submit, its maximum heap size can be controlled with spark.driver.memory or the --driver-memory flag.
  • If you start it as a plain old Java program, use the usual -Xmx Java flag. (Examples of both are sketched below.)
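
A minimal sketch of both options (the application jar and class name below are placeholders, not from the original answer):

```
# Driver launched via spark-submit: either form sets the driver heap to 2 GB.
spark-submit --driver-memory 2g --class com.example.MyApp my-app.jar
spark-submit --conf spark.driver.memory=2g --class com.example.MyApp my-app.jar

# Driver launched as a plain Java program: size its heap with the usual JVM flag.
java -Xmx2g -cp my-app.jar com.example.MyApp
```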
– Daniel Darabos
  • Good answer. Though, are you sure the driver process is not started by Spark? If not, who or what starts it? – Mikel Urkia Mar 16 '15 at 14:32
  • The driver is your application. You can start it with the plain `java` command. Spark provides the `spark-submit` tool which can also be used to start your application. – Daniel Darabos Mar 16 '15 at 14:40
  • That's what I thought. Since I usually use `spark-submit` to launch *Spark* jobs, your statement caught me by surprise. Thanks for the clarification. – Mikel Urkia Mar 16 '15 at 16:19
  • Thanks. I've rephrased the last sentence to make it more clear which flags/settings are for which method of starting it. – Daniel Darabos Mar 16 '15 at 22:21
  • @DanielDarabos, thanks a lot. I did not notice the 2nd part of your answer and had a hard time setting the value in IDE. `-Xmx` saved my day. – David S. Mar 21 '15 at 23:53

Please see the following question for an explanation of the 265.4MB memory size:

How to set Apache Spark Executor memory
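
In short (a sketch, assuming Spark 1.x defaults): the UI reports storage memory rather than the full heap, roughly usable heap × spark.storage.memoryFraction (0.6) × spark.storage.safetyFraction (0.9). With a default ~512m driver heap, the JVM reports roughly 491.5MB of usable heap, which is where 265.4MB comes from:

```
# Rough arithmetic behind the 265.4MB figure (Spark 1.x defaults; the
# ~491.5MB usable-heap value is approximate and JVM-dependent).
echo "491.5 * 0.6 * 0.9" | bc   # => 265.4
```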

– ᐅdevrimbaris