How does web UI calculate Storage Memory (in Executors tab)?
Question
I'm trying to understand how Spark 2.1.0 allocates memory on nodes.
Suppose I'm starting a local PySpark REPL and assigning it 2 GB of memory:
$ pyspark --conf spark.driver.memory=2g
The Spark UI reports 956.6 MB allocated for storage memory:
I don't understand how Spark arrives at that number. Here's my thinking process:
- Driver heap size is set to 2048 MB,
- According to the docs, (2048 MB - 300 MB) * 0.6 = 1048.8 MB are used for both the execution and storage regions (unified),
- Additionally, 1048.8 MB * 0.5 = 524.4 MB within the unified region should be reserved as the immune storage region
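The steps above can be written out as plain arithmetic (a sketch using the default spark.memory.fraction = 0.6 and spark.memory.storageFraction = 0.5 from the docs):

```python
driver_memory_mb = 2048
reserved_mb = 300         # fixed reserved memory
memory_fraction = 0.6     # spark.memory.fraction default
storage_fraction = 0.5    # spark.memory.storageFraction default

# unified region shared by execution and storage: 1048.8 MB
unified_mb = (driver_memory_mb - reserved_mb) * memory_fraction

# storage region immune to eviction: 524.4 MB
storage_mb = unified_mb * storage_fraction

print(round(unified_mb, 1), round(storage_mb, 1))
```

Neither 1048.8 MB nor 524.4 MB matches the 956.6 MB shown in the UI, which is what prompts the question.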
So how does Spark actually arrive at the 956.6 MB value?
Answer
You seem to be using local mode (with one driver that also acts as the only executor), but the following should also apply to other cluster modes.
Enable the INFO logging level for BlockManagerMasterEndpoint to see how much memory Spark derives from the properties you set on the command line (such as spark.driver.memory).
log4j.logger.org.apache.spark.storage.BlockManagerMasterEndpoint=INFO
When you start spark-shell --conf spark.driver.memory=2g
you'll see the following:
$ ./bin/spark-shell --conf spark.driver.memory=2g
...
17/05/07 15:20:50 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.8:57177 with 912.3 MB RAM, BlockManagerId(driver, 192.168.1.8, 57177, None)
As you can see, the available memory is 912.3 MB, which is calculated as follows (see UnifiedMemoryManager.getMaxMemory):
// local mode with --conf spark.driver.memory=2g
scala> sc.getConf.getSizeAsBytes("spark.driver.memory")
res0: Long = 2147483648

scala> val systemMemory = Runtime.getRuntime.maxMemory

// fixed amount of memory for non-storage, non-execution purposes
scala> val reservedMemory = 300 * 1024 * 1024

// minimum system memory required
scala> val minSystemMemory = (reservedMemory * 1.5).ceil.toLong

scala> val usableMemory = systemMemory - reservedMemory

scala> val memoryFraction = sc.getConf.getDouble("spark.memory.fraction", 0.6)

scala> val maxMemory = (usableMemory * memoryFraction).toLong
maxMemory: Long = 956615884

scala> import org.apache.spark.network.util.JavaUtils

scala> JavaUtils.byteStringAsMb(maxMemory + "b")
res1: Long = 912
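The same computation can be sketched in plain Python. Note that the system_memory value below is an assumption: Runtime.getRuntime.maxMemory typically reports less than -Xmx (the JVM subtracts internal overhead such as a survivor space), and the value here is simply one that reproduces the REPL output above:

```python
# Sketch of UnifiedMemoryManager.getMaxMemory.
# ASSUMPTION: system_memory (~1.78 GB) approximates what
# Runtime.getRuntime.maxMemory returned in this session with -Xmx 2g.
system_memory = 1_908_932_608
reserved_memory = 300 * 1024 * 1024              # fixed 300 MB reserve
min_system_memory = int(reserved_memory * 1.5)   # minimum heap Spark requires
assert system_memory >= min_system_memory

usable_memory = system_memory - reserved_memory
memory_fraction = 0.6                            # spark.memory.fraction default
max_memory = int(usable_memory * memory_fraction)

print(max_memory)                                # 956615884 bytes
print(round(max_memory / (1024 * 1024), 1))      # 912.3 (base-1024 MB)
```

Dividing by 1024 * 1024, as JavaUtils.byteStringAsMb does, turns 956615884 bytes into the 912.3 MB seen in the BlockManagerMasterEndpoint log line.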
Let's review how web UI calculates the memory (which is different from what's above and is supposed to just display the value!). That's the surprising part.
956.6! That's exactly what the web UI shows, and it is quite different from what Spark's UnifiedMemoryManager considers the available memory. Quite surprising, isn't it?

I think it's a bug and filed it as SPARK-20691.
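A quick arithmetic check makes the discrepancy concrete. This is a sketch under the assumption (my reading of the mismatch reported in SPARK-20691) that the web UI's byte formatter converts with a base-1000 divisor, while JavaUtils uses base-1024:

```python
# maxMemory as computed by UnifiedMemoryManager.getMaxMemory above
max_memory = 956_615_884  # bytes

# Base-1024 conversion (what JavaUtils.byteStringAsMb does)
internal_mb = max_memory / (1024 * 1024)   # 912.3

# Base-1000 conversion (assumed web UI formatter behavior)
ui_mb = max_memory / (1000 * 1000)         # 956.6

print(f"internal: {internal_mb:.1f} MB, web UI: {ui_mb:.1f} MB")
```

The same byte count, formatted with two different divisors, yields exactly the two numbers seen in this question: 912.3 in the logs and 956.6 in the Executors tab.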