How to check status of Spark applications from the command line?
Question
To check running applications in Apache Spark, one can view them in the web interface at the URL:
http://<master>:8080
My question is: how can we check running applications from the terminal? Is there any command that returns the application status?
Answer
If it's for the Spark Standalone or Apache Mesos cluster managers, @sb0709's answer is the way to go.
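For the standalone case, the status shown on the master's web UI is also exposed as JSON, so it can be fetched from the terminal as well. A minimal sketch, assuming the master UI is on the default port 8080 and your Spark version serves the `/json` endpoint (replace `<master>` with your actual master host):

```shell
# Fetch the standalone master's status page as JSON.
# The response includes cluster-level fields and lists of applications
# (e.g. an "activeapps" array with each application's id, name and state).
curl -s "http://<master>:8080/json"
```

This returns the same information as the web page, in a form that is easy to filter with tools like jq.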
For YARN, you should use the yarn application command:
$ yarn application -help
usage: application
 -appStates <States>             Works with -list to filter applications
                                 based on input comma-separated list of
                                 application states. The valid application
                                 state can be one of the following:
                                 ALL,NEW,NEW_SAVING,SUBMITTED,ACCEPTED,
                                 RUNNING,FINISHED,FAILED,KILLED
 -appTypes <Types>               Works with -list to filter applications
                                 based on input comma-separated list of
                                 application types.
 -help                           Displays help for all commands.
 -kill <Application ID>          Kills the application.
 -list                           List applications. Supports optional use
                                 of -appTypes to filter applications based
                                 on application type, and -appStates to
                                 filter applications based on application
                                 state.
 -movetoqueue <Application ID>   Moves the application to a different
                                 queue.
 -queue <Queue Name>             Works with the movetoqueue command to
                                 specify which queue to move an
                                 application to.
 -status <Application ID>        Prints the status of the application.
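Putting those flags together, a typical way to inspect Spark jobs on YARN might look like the following (this assumes Spark registers its jobs with application type SPARK, as spark-submit does on YARN; the application ID shown is a hypothetical example):

```shell
# List all currently running Spark applications on the cluster
yarn application -list -appStates RUNNING -appTypes SPARK

# Print the detailed status (state, tracking URL, progress, ...) of one application
yarn application -status application_1234567890123_0001

# Kill a misbehaving application
yarn application -kill application_1234567890123_0001
```

The application ID to pass to -status or -kill is the first column of the -list output.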