How to get memory and CPU usage by a Spark application?


Problem description

I want to get the average resource utilization of a Spark job for monitoring purposes. How can I poll the resources (i.e., CPU and memory utilization) of a Spark application?

Recommended answer

You have to pull the logs from YARN.

Command line: `yarn application -logs {YourAppID}`. You can get the application ID from the stack trace of the Spark job, from the `yarn application -list` command, or from the UI. More on the yarn commands is here.

From the UI: If you are using Cloudera, you can see applications at http://${LOCALHOST}:7180/cmf/services/17/applications, and you can get to the DAG with http://${LOCALHOST}:8088/cluster.
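Beyond reading logs and the web UI, the ResourceManager at port 8088 also exposes a REST endpoint (`/ws/v1/cluster/apps/{appId}`) whose application report includes aggregate counters such as `memorySeconds` (MB-seconds) and `vcoreSeconds` (vcore-seconds), from which you can derive average utilization. A minimal sketch, assuming a reachable ResourceManager host and standard report fields (the host name and app ID below are placeholders):

```python
import json
import urllib.request


def average_usage(app: dict) -> dict:
    """Derive average resource usage from a YARN application report.

    memorySeconds is in MB-seconds, vcoreSeconds in vcore-seconds,
    and elapsedTime in milliseconds.
    """
    elapsed_s = app["elapsedTime"] / 1000.0
    return {
        "avg_memory_mb": app["memorySeconds"] / elapsed_s,
        "avg_vcores": app["vcoreSeconds"] / elapsed_s,
    }


def poll_app(rm_host: str, app_id: str) -> dict:
    # rm_host is the ResourceManager web address, e.g. "localhost:8088"
    # (placeholder -- substitute your cluster's host and application ID).
    url = f"http://{rm_host}/ws/v1/cluster/apps/{app_id}"
    with urllib.request.urlopen(url) as resp:
        report = json.load(resp)["app"]
    return average_usage(report)
```

Calling `poll_app` periodically (e.g. from a cron job) gives you a time series of averages; `average_usage` alone works on any saved report, so you can also run it offline against the JSON of a finished application.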
