SPARK: How to monitor the memory consumption on Spark cluster?


Question

Sorry for the basic question, but I couldn't figure it out by myself.

I was trying to figure out, from the Spark UI, how much memory is available and used on each worker and on the driver.

Is there any straightforward and simple way to monitor this information?

My goal is to decide on my persistence strategy according to how much space my data occupies on the workers and on the driver.

P.S. I am using standalone mode, on Spark 1.6.1.

Answer

I think the Executors tab has the information you need. If you have a Spark application running, you will find it at http://localhost:4040/executors/. Best!
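If you want the same numbers programmatically rather than from the UI, `SparkContext.getExecutorMemoryStatus` (available in 1.6) reports the storage memory of each executor. A minimal sketch, assuming an already-created `SparkContext` named `sc`:

```scala
// getExecutorMemoryStatus maps each executor address ("host:port")
// to (max memory available for caching, remaining free memory), in bytes.
// In standalone mode the driver's block manager appears in the map as well.
sc.getExecutorMemoryStatus.foreach { case (executor, (maxBytes, remainingBytes)) =>
  val usedMB = (maxBytes - remainingBytes) / (1024 * 1024)
  val maxMB  = maxBytes / (1024 * 1024)
  println(s"$executor: $usedMB MB used of $maxMB MB storage memory")
}
```

The same figures are also exposed over Spark's REST monitoring API, at http://localhost:4040/api/v1/applications/[app-id]/executors.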

