YARN REST API - Spark job submission


Problem description

I am trying to use the YARN REST API to submit spark-submit jobs that I normally run from the command line.

My command-line spark-submit invocation looks like this:

JAVA_HOME=/usr/local/java7/ HADOOP_CONF_DIR=/etc/hadoop/conf /usr/local/spark-1.5/bin/spark-submit \
--driver-class-path "/etc/hadoop/conf" \
--class MySparkJob \
--master yarn-cluster \
--conf "spark.executor.extraClassPath=/usr/local/hadoop/client/hadoop-*" \
--conf "spark.driver.extraClassPath=/usr/local/hadoop/client/hadoop-*" \
spark-job.jar --retry false --counter 10

Reading through the YARN REST API documentation at https://hadoop.apache.org/docs/stable/hadoop-yarn/hadoop-yarn-site/ResourceManagerRest.html#Cluster_Applications_APISubmit_Application, I tried to create a JSON payload to POST, which looks like:

{
  "am-container-spec": {
    "commands": {
      "command": "JAVA_HOME=/usr/local/java7/ HADOOP_CONF_DIR=/etc/hadoop/conf org.apache.hadoop.yarn.applications.distributedshell.ApplicationMaster  --jar spark-job.jar --class MySparkJob --arg --retry --arg false --arg --counter --arg 10"
    }, 
    "local-resources": {
      "entry": [
        {
          "key": "spark-job.jar", 
          "value": {
            "resource": "hdfs:///spark-job.jar", 
            "size": 3214567, 
            "timestamp": 1452408423000, 
            "type": "FILE", 
            "visibility": "APPLICATION"
          }
        }
      ]
    }
  }, 
  "application-id": "application_11111111111111_0001", 
  "application-name": "test",
  "application-type": "Spark"   
}

The problem I see is that the Hadoop config directory was local to the machine I used to run jobs from. Now that I submit the job via the REST API and it runs directly on the RM, I am not sure how to provide these details.

Solution

If you are trying to submit Spark jobs via REST APIs, I would suggest having a look at Livy. It is the simplest and easiest way to submit Spark jobs to a cluster.

Livy is an open source REST interface for interacting with Apache Spark from anywhere. It supports executing snippets of code or programs in a Spark context that runs locally or in Apache Hadoop YARN.

  • Interactive Scala, Python and R shells
  • Batch submissions in Scala, Java, Python
  • Multiple users can share the same server (impersonation support)
  • Can be used for submitting jobs from anywhere with REST
  • Does not require any code change to your programs

We've also tried submitting applications through the Java RMI option.
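
For completeness, if you stay with the direct ResourceManager route from the question, the documented flow is two calls: first request a fresh application ID, then POST the full submission payload (like the JSON above) with that ID filled in. A minimal sketch, assuming the RM web services at rm-host:8088 (rm-host and the spark-submission.json filename are placeholders):

# Step 1: ask the ResourceManager for a new application ID
# (the response is JSON containing an "application-id" field)
curl -s -X POST http://rm-host:8088/ws/v1/cluster/apps/new-application

# Step 2: submit the application, with that ID substituted into the payload
curl -s -X POST -H "Content-Type: application/json" \
  -d @spark-submission.json \
  http://rm-host:8088/ws/v1/cluster/apps

Note that the submission payload also accepts an "environment" map under "am-container-spec", which is the documented place to set variables such as HADOOP_CONF_DIR for the ApplicationMaster container, provided the referenced files actually exist on the cluster nodes.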

