
I'm trying to understand how Spark 2.1.0 allocates memory on nodes.

Suppose I'm starting a local PySpark REPL, assigning it 2 GB of memory:

$ pyspark --conf spark.driver.memory=2g

The Spark UI shows that 956.6 MB are allocated for storage memory:

[screenshot: Spark web UI showing Storage Memory of 956.6 MB]

I don't understand how Spark arrives at that number. This is my thinking process (sketched in code right after the list):

  1. The driver heap size is set to 2048 MB,
  2. According to the docs: (2048 MB - 300 MB) * 0.6 = 1048.8 MB is used for both the execution and storage regions (unified),
  3. Additionally, 1048.8 MB * 0.5 = 524.4 MB within the unified region should be reserved as the storage region immune to eviction.
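
In code, the calculation I expected (a quick Scala sketch of my own; it assumes Spark starts from the full 2048 MB heap):

// expected calculation (a sketch; assumes systemMemory == the full 2048 MB heap)
val driverMemory = 2048L * 1024 * 1024                       // 2147483648 bytes
val reserved     = 300L * 1024 * 1024                        // 300 MB reserved per the docs
val unified      = ((driverMemory - reserved) * 0.6).toLong  // 1099746508 bytes ~ 1048.8 MB
val storageMin   = (unified * 0.5).toLong                    //  549873254 bytes ~  524.4 MB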

So how did Spark actually arrive at the 956.6 MB value?


1 Answer


You seem to be using local mode (with one driver that also acts as the only executor), but the following should also apply to other cluster modes.

Enable the INFO logging level for BlockManagerMasterEndpoint to see how much memory Spark actually registers for the value you set on the command line (as spark.driver.memory). Add the following to conf/log4j.properties:

log4j.logger.org.apache.spark.storage.BlockManagerMasterEndpoint=INFO

When you start spark-shell --conf spark.driver.memory=2g you'll see the following:

$ ./bin/spark-shell --conf spark.driver.memory=2g
...
17/05/07 15:20:50 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.8:57177 with 912.3 MB RAM, BlockManagerId(driver, 192.168.1.8, 57177, None)

As you can see, the available memory is 912.3 MB, which is calculated as follows (see UnifiedMemoryManager.getMaxMemory):

// local mode with --conf spark.driver.memory=2g
scala> sc.getConf.getSizeAsBytes("spark.driver.memory")
res0: Long = 2147483648

// note: maxMemory is smaller than -Xmx because the JVM keeps part of the
// heap (e.g. a survivor space) out of what it reports as available
scala> val systemMemory = Runtime.getRuntime.maxMemory

// fixed amount of memory for non-storage, non-execution purposes
scala> val reservedMemory = 300 * 1024 * 1024

// minimum system memory required -- getMaxMemory only uses this as a sanity
// check (it fails fast if systemMemory is below it); it's not part of the formula
scala> val minSystemMemory = (reservedMemory * 1.5).ceil.toLong

scala> val usableMemory = systemMemory - reservedMemory

scala> val memoryFraction = sc.getConf.getDouble("spark.memory.fraction", 0.6)

scala> val maxMemory = (usableMemory * memoryFraction).toLong
maxMemory: Long = 956615884

scala> import org.apache.spark.network.util.JavaUtils
scala> JavaUtils.byteStringAsMb(maxMemory + "b")
res1: Long = 912
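
As a plain-arithmetic sanity check (nothing Spark-specific), the 912.3 in the log entry above is the same byte count converted with a binary base:

// the log's 912.3 is maxMemory divided by a binary megabyte (MiB)
956615884L / (1024.0 * 1024)   // = 912.2999... ~ 912.3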

Let's now review how the web UI calculates the memory (which is different from the above, even though it is only supposed to display the value!). That's the surprising part.

How Storage Memory is displayed in the web UI is controlled by the custom JavaScript function formatBytes in utils.js, which (translated to Scala) looks as follows:

def formatBytes(bytes: Double) = {
  val k = 1000  // decimal base (1000), unlike the binary 1024 JavaUtils used above
  val i = math.floor(math.log(bytes) / math.log(k))
  val maxMemoryWebUI = bytes / math.pow(k, i)
  f"$maxMemoryWebUI%1.1f"
}

scala> println(formatBytes(maxMemory))
956.6

956.6! That's exactly what the web UI shows, and it is quite different from what Spark's UnifiedMemoryManager considers the available memory. Quite surprising, isn't it?


I think it's a bug and filed it as SPARK-20691.
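
As a final aside, and to reconcile this with the 1048.8 MB expected in the question: you can work backwards from maxMemory to what Runtime.getRuntime.maxMemory must have returned. A sketch, inverting the formula above:

// working backwards from the observed maxMemory (a sketch)
val maxMemory    = 956615884L
val reserved     = 300L * 1024 * 1024
val systemMemory = (maxMemory / 0.6).toLong + reserved   // ~ 1908932606 bytes ~ 1.78 GiB
// That's noticeably less than -Xmx2g: Runtime.getRuntime.maxMemory typically
// returns the heap minus one survivor space, which is why the question's
// (2048 MB - 300 MB) * 0.6 = 1048.8 MB never materializes.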

  • Thanks for the very descriptive and exhaustive answer. The fragment causing the misunderstanding was `reservedMemory * 1.5`, which should be documented somewhere (the docs state that 300 MB are reserved, while in reality it's 450 MB). – Khozzy May 10 '17 at 07:51
  • Thanks for accepting the answer. Just a comment on yours: `reservedMemory` is 300 MB, but it's the minimum memory that's required for a Spark app, isn't it? – Jacek Laskowski May 10 '17 at 07:56
  • @JacekLaskowski So if I want to reduce storage memory, what exact parameter should I put in the configuration? – jk1 Jun 28 '18 at 07:48
  • @jk1 I think it'd be `--executor-memory` since a StorageManager lives there. Could you please ask a separate question so it won't fall through the cracks? Thanks. – Jacek Laskowski Jun 28 '18 at 19:50
  • @JacekLaskowski: You should add this answer to the Spark documentation. (reservedMemory * 1.5) is written nowhere, and it makes the entire calculation go haywire, to the point you think your entire knowledge is wrong. – dev Jul 01 '20 at 06:50
  • @ETL_Devs What a good idea! But then (after giving it serious thought) I'm concerned I'd be left with no clients who'd pay for this precious knowledge. – Jacek Laskowski Jul 02 '20 at 16:59
  • In the `Storage Memory` example `20G/1G`, what does the 20G mean? Is it the total usage since app start? Thanks a lot. – roamer Sep 25 '20 at 03:39
  • @roamer It's all ever to be used. – Jacek Laskowski Sep 25 '20 at 07:16
  • `Runtime.getRuntime.maxMemory` is equal to 2147483648, right? If so, then... well, I don't know Scala, but is the given code right? It should compute to the 1048.8 the OP has in the question, not the 912 you are getting. You have computed `minSystemMemory` but not used it anywhere. In computing `usableMemory`, if you used `minSystemMemory` (450) instead of the `reservedMemory` (300) that you did, you should get 958.8, not 912 or 956.6. – figs_and_nuts Jan 22 '22 at 00:02