Does spark standalone cluster require homogeneous machines?


Problem description

I want to run a GraphX application on a cluster that includes machines with different amounts of RAM. So far I have tried the Spark standalone cluster, but it seems I cannot adjust the spark.executor.memory property per machine. Am I right?

Answer

As far as I know, if you set the option spark.executor.memory manually, your tasks will be launched only on workers that have at least the amount of RAM you specified.
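As a sketch of what this looks like in practice, here is a hypothetical spark-submit invocation (the master host and jar name are placeholders). With spark.executor.memory set to 4g, executors would only be scheduled on workers that advertise at least that much memory:

```shell
# Hypothetical example: request 4 GB per executor.
# Workers with less than 4 GB of advertised memory will not run these executors.
spark-submit \
  --master spark://master-host:7077 \
  --conf spark.executor.memory=4g \
  my-graphx-app.jar
```

This is why a single cluster-wide value effectively forces you toward the smallest machine's capacity, or excludes the small machines entirely.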

So when using this option, you need a uniform cluster.

By default, each worker uses all the RAM available on its machine (minus 1 GB for the OS), and you can only limit the amount of RAM to be used. I don't know if that is what you mean by "adjust".

http://spark.apache.org/docs/latest/spark-standalone.html (search for SPARK_WORKER_MEMORY)
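The per-machine limit mentioned above is set in each worker's conf/spark-env.sh. A minimal sketch, with example values chosen to leave roughly 1 GB for the OS on each machine:

```shell
# conf/spark-env.sh on a worker with 16 GB of RAM:
SPARK_WORKER_MEMORY=15g   # total memory this worker offers to executors

# On a worker with 8 GB of RAM you would instead set:
# SPARK_WORKER_MEMORY=7g
```

Because this file lives on each worker, it can differ per machine, which is the only per-machine memory knob in standalone mode; executor sizing via spark.executor.memory remains a single value per application.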

