How can I monitor memory and CPU usage by a Spark application?
Problem description
After running my Spark application, I want to monitor its memory and CPU usage to evaluate its performance, but I couldn't find any option for this. Is it possible? How can I monitor memory and CPU usage by a Spark application?
Recommended answer
There are a few options:
- Ganglia is one option.
- If you're running on your own cluster, HDP and Cloudera both provide real-time CPU and memory consumption charts.
- If you want specific JVM metrics, then I'd recommend FlameGraph, though it's not real time.
- There's also Grafana, which is extremely powerful; you can track many metrics with it, and it's real time.
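As a lightweight alternative to the tools above, Spark's own monitoring REST API (served by the driver UI, default port 4040) exposes per-executor memory and core usage. Below is a minimal sketch in Python; the host name, port, and helper function names are illustrative assumptions, while the JSON field names (`totalCores`, `memoryUsed`, `maxMemory`) follow Spark's documented `ExecutorSummary` payload:

```python
import json
import urllib.request


def summarize_executors(executors):
    """Aggregate memory and core counts from the list of executor
    dicts returned by Spark's /executors REST endpoint."""
    return {
        "executors": len(executors),
        "total_cores": sum(e.get("totalCores", 0) for e in executors),
        "memory_used_bytes": sum(e.get("memoryUsed", 0) for e in executors),
        "max_memory_bytes": sum(e.get("maxMemory", 0) for e in executors),
    }


def fetch_executors(driver_host="localhost", port=4040, app_id=None):
    """Query the REST API of a running driver's UI.
    GET /api/v1/applications lists application IDs; if none is
    given, the first (most recent) application is used."""
    base = f"http://{driver_host}:{port}/api/v1/applications"
    if app_id is None:
        with urllib.request.urlopen(base) as resp:
            app_id = json.load(resp)[0]["id"]
    with urllib.request.urlopen(f"{base}/{app_id}/executors") as resp:
        return json.load(resp)


# Usage (against a live driver):
#   summarize_executors(fetch_executors("my-driver-host"))
```

Polling this endpoint periodically gives a rough memory/CPU profile without installing any external monitoring stack, though it only reports what the driver tracks, not OS-level process usage.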