How to get all jobs status through Spark REST API?
Question
I am using Spark 1.5.1 and I'd like to retrieve the status of all jobs through the REST API.

I get a correct result using /api/v1/applications/{appId}. But when accessing the jobs endpoint /api/v1/applications/{appId}/jobs, I get a "no such app: {appID}" response.

How should I pass the app ID here to retrieve the jobs status of an application using the Spark REST API?
Recommended answer
Spark provides 4 hidden RESTful APIs:
1) Submit a job - curl -X POST http://SPARK_MASTER_IP:6066/v1/submissions/create
2) Kill a job - curl -X POST http://SPARK_MASTER_IP:6066/v1/submissions/kill/driver-id
3) Check the status of a job - curl http://SPARK_MASTER_IP:6066/v1/submissions/status/driver-id
4) Status of the Spark cluster - http://SPARK_MASTER_IP:8080/json/
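As a sketch of steps 1 and 3 above, the script below builds the JSON body that the hidden :6066 submission endpoint expects and prints it, with the actual curl calls shown as comments so nothing is sent without a live master. The jar path, main class, master URL, and app name are placeholders you must replace; the field names follow the spark-submit REST protocol.

```shell
#!/bin/sh
# Hypothetical master address -- replace with your cluster's.
SPARK_MASTER_IP="127.0.0.1"

# CreateSubmissionRequest body for the hidden :6066 submission endpoint.
# appResource / mainClass / spark.master values below are placeholders.
PAYLOAD=$(cat <<'EOF'
{
  "action": "CreateSubmissionRequest",
  "appResource": "file:/path/to/my-app.jar",
  "mainClass": "com.example.MyApp",
  "clientSparkVersion": "1.5.1",
  "appArgs": ["arg1"],
  "environmentVariables": { "SPARK_ENV_LOADED": "1" },
  "sparkProperties": {
    "spark.master": "spark://127.0.0.1:7077",
    "spark.app.name": "MyApp",
    "spark.submit.deployMode": "cluster"
  }
}
EOF
)

echo "$PAYLOAD"

# Against a live master (uncomment); the create response contains a
# submissionId, which is the driver-id used by the status/kill endpoints:
# curl -s -X POST -H "Content-Type: application/json" \
#      -d "$PAYLOAD" "http://$SPARK_MASTER_IP:6066/v1/submissions/create"
# curl -s "http://$SPARK_MASTER_IP:6066/v1/submissions/status/driver-id"
```

Note that :6066 is the standalone master's REST submission port, separate from the :4040/:18080 monitoring API the question uses.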
If you want to use other APIs you can try Livy; see the Lucidworks guide - https://doc.lucidworks.com/fusion/3.0/Spark_ML/Spark-Getting-Started.html
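If you go the Livy route, jobs are submitted as "batches" over plain HTTP. A minimal sketch, assuming Livy listens on its default port 8998; the jar path and class name are placeholders, and the curl calls are commented out since they need a running Livy server:

```shell
#!/bin/sh
# Hypothetical Livy endpoint -- replace with your server's address.
LIVY_URL="http://localhost:8998"

# Body for Livy's POST /batches endpoint; file and className are placeholders.
BATCH=$(cat <<'EOF'
{
  "file": "file:/path/to/my-app.jar",
  "className": "com.example.MyApp",
  "args": ["arg1"]
}
EOF
)

echo "$BATCH"

# Against a live Livy server (uncomment):
# curl -s -X POST -H "Content-Type: application/json" \
#      -d "$BATCH" "$LIVY_URL/batches"
# Poll a batch's state (starting / running / success / dead) by its id:
# curl -s "$LIVY_URL/batches/0"
```

Unlike the hidden :6066 API, Livy also lets you poll job state through the same HTTP interface you submitted with.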