How does spark.python.worker.memory relate to spark.executor.memory?
Problem Description
This diagram is quite clear on the relationship between the different YARN and Spark memory related settings, except when it comes to spark.python.worker.memory.
How does spark.python.worker.memory fit into this memory model?
Are the Python processes governed by spark.executor.memory or yarn.nodemanager.resource.memory-mb?
Update
This question explains what the setting does, but doesn't answer the question concerning the memory governance, or how it relates to other memory settings.
Recommended Answer
Found this thread from the Apache Spark mailing list, and it appears that spark.python.worker.memory is a subset of the memory from spark.executor.memory.
From the thread: "spark.python.worker.memory is used for Python worker in executor"
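As a concrete sketch of how the two settings sit together, both can be passed at submit time. The values and the application name below are arbitrary illustrations, not recommendations:

```shell
# Hypothetical spark-submit invocation (my_app.py is a placeholder).
# spark.executor.memory caps the JVM heap of each executor, while
# spark.python.worker.memory limits how much memory each Python worker
# may use during aggregation before it starts spilling to disk.
spark-submit \
  --master yarn \
  --conf spark.executor.memory=4g \
  --conf spark.python.worker.memory=512m \
  my_app.py
```

Since the Python workers' usage is accounted against the executor's allotment, spark.python.worker.memory should be kept well below spark.executor.memory.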