设置"spark.memory.storageFraction".在Spark中不起作用 [英] Setting "spark.memory.storageFraction" in Spark does not work


Problem description

I am trying to tune Spark's memory parameters. I tried:

sparkSession.conf.set("spark.memory.storageFraction", "0.1")  // sparkSession has already been created

After submitting the job, I checked the Spark UI and found that "Storage Memory" is the same as before, so the setting did not take effect.

设置"spark.memory.storageFraction"的正确方法是什么?

What is the correct way to set "spark.memory.storageFraction"?

I am using Spark 2.0.
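
As a point of comparison, here is a minimal sketch of supplying the value before the session (and its SparkContext) is created, since spark.memory.* settings are read when the memory manager is set up at startup; the app name and the numeric values below are placeholders, not taken from the question:

import org.apache.spark.sql.SparkSession

object StorageFractionDemo {
  def main(args: Array[String]): Unit = {
    // Pass memory settings before the SparkContext exists; setting them on an
    // already-created session does not reconfigure running executors.
    val spark = SparkSession.builder()
      .appName("storage-fraction-demo")                // placeholder name
      .config("spark.memory.fraction", "0.6")          // unified (execution + storage) pool
      .config("spark.memory.storageFraction", "0.1")   // share of that pool kept for storage
      .getOrCreate()

    // The same settings can also be passed on the command line, e.g.:
    //   spark-submit --conf spark.memory.storageFraction=0.1 ...
    println(spark.conf.get("spark.memory.storageFraction"))
    spark.stop()
  }
}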

Recommended answer

I faced the same problem. After reading some of the code on Spark's GitHub, I think the "Storage Memory" value on the Spark UI is misleading: it does not indicate the size of the storage region, but rather the maxMemory value:

maxMemory = (systemMemory - reservedMemory [default 300 MB]) * spark.memory.fraction [default 0.6]

Check the getMaxMemory source code for more detail.
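
To make the arithmetic concrete, below is a small self-contained sketch of that calculation, modelled on UnifiedMemoryManager.getMaxMemory; the 4 GB heap and the variable names are illustrative, not taken from the question:

// Rough sketch of the number the UI reports as "Storage Memory",
// modelled on UnifiedMemoryManager.getMaxMemory (values are illustrative).
object MaxMemorySketch {
  def main(args: Array[String]): Unit = {
    val systemMemory   = 4L * 1024 * 1024 * 1024   // e.g. a 4 GB executor heap (Runtime.getRuntime.maxMemory)
    val reservedMemory = 300L * 1024 * 1024        // reserved system memory, 300 MB by default
    val memoryFraction = 0.6                       // spark.memory.fraction, default 0.6

    // maxMemory: the whole unified (execution + storage) pool shown in the UI
    val maxMemory = ((systemMemory - reservedMemory) * memoryFraction).toLong

    // storageFraction only moves the boundary inside that pool,
    // which is why changing it does not change the UI figure.
    val storageFraction   = 0.1                    // spark.memory.storageFraction
    val storageRegionSize = (maxMemory * storageFraction).toLong

    println(f"maxMemory (UI 'Storage Memory') = ${maxMemory / 1024.0 / 1024}%.1f MB")
    println(f"storage region size             = ${storageRegionSize / 1024.0 / 1024}%.1f MB")
  }
}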
