Apache Spark REST API


Problem description

I'm invoking spark-submit with log4j properties like this:

/opt/spark-1.6.2-bin-hadoop2.6/bin/spark-submit \
  --driver-java-options "-Dlog4j.configuration=file:/home/test_api/log4j-driver.properties" \
  --class Test testing.jar

How do I pass --driver-java-options when submitting a job via curl (Apache Spark's hidden REST API)?

I tried:

curl -X POST http://host-ip:6066/v1/submissions/create --header "Content-Type:application/json;charset=UTF-8" --data '{
  "action" : "CreateSubmissionRequest",
  "appArgs" : [ "" ],
  "appResource" : "hdfs://host-ip:9000/test/testing.jar",
  "clientSparkVersion" : "1.6.2",
  "environmentVariables" : {
    "SPARK_ENV_LOADED" : "1"
  },
  "mainClass" : "Test",
  "spark.driver.extraJavaOptions" : "-Dlog4j.configuration=file:/home/test_api/log4j-driver.properties",
  "sparkProperties" : {
    "spark.jars" : "hdfs://host-ip:9000/test/testing.jar",
    "spark.app.name" : "Test",
    "spark.eventLog.enabled" : "true",
    "spark.eventLog.dir" : "hdfs://host-ip:9000/test/spark-events",
    "spark.submit.deployMode" : "cluster",
    "spark.master" : "spark://host-ip:7077"
  }
}'

The job was submitted successfully and a response came back, but with one unknown field:

{
  "action" : "CreateSubmissionResponse",
  "message" : "Driver successfully submitted as driver-20160810210057-0091",
  "serverSparkVersion" : "1.6.2",
  "submissionId" : "driver-20160810210057-0091",
  "success" : true,
  "unknownFields" : [ "spark.driver.extraJavaOptions" ]
}


I also tried driverExtraJavaOptions as follows:

curl -X POST http://host-ip:6066/v1/submissions/create --header "Content-Type:application/json;charset=UTF-8" --data '{
  "action" : "CreateSubmissionRequest",
  "appArgs" : [ "" ],
  "appResource" : "hdfs://host-ip:9000/test/testing.jar",
  "clientSparkVersion" : "1.6.2",
  "environmentVariables" : {
    "SPARK_ENV_LOADED" : "1"
  },
  "mainClass" : "Test",
  "driverExtraJavaOptions" : "-Dlog4j.configuration=file:/home/test_api/log4j-driver.properties",
  "sparkProperties" : {
    "spark.jars" : "hdfs://host-ip:9000/test/testing.jar",
    "spark.app.name" : "Test",
    "spark.eventLog.enabled" : "true",
    "spark.eventLog.dir" : "hdfs://host-ip:9000/test/spark-events",
    "spark.submit.deployMode" : "cluster",
    "spark.master" : "spark://host-ip:7077"
  }
}'

But got a similar response:

{
  "action" : "CreateSubmissionResponse",
  "message" : "Driver successfully submitted as driver-20160810211432-0094",
  "serverSparkVersion" : "1.6.2",
  "submissionId" : "driver-20160810211432-0094",
  "success" : true,
  "unknownFields" : [ "driverExtraJavaOptions" ]
}

Why is this?
I looked at spark-submit.scala and referenced the Spark REST API.
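A likely explanation, going by the two responses above: the hidden REST endpoint only deserializes a fixed set of top-level fields in a CreateSubmissionRequest, and echoes anything else back under "unknownFields" instead of rejecting the request. Below is a minimal Python sketch of building the payload with the Java options nested where the server does accept them, inside sparkProperties. The KNOWN_TOP_LEVEL set is inferred from the requests shown here, not from an official schema, and host-ip and the paths are the placeholders from the question:

```python
import json

# Top-level fields the 6066 endpoint appears to recognize in a
# CreateSubmissionRequest (inferred from the requests above); any other
# top-level key comes back under "unknownFields".
KNOWN_TOP_LEVEL = {
    "action", "appArgs", "appResource", "clientSparkVersion",
    "environmentVariables", "mainClass", "sparkProperties",
}

def build_submission(extra_java_options):
    # spark.driver.extraJavaOptions must live inside sparkProperties,
    # not at the top level of the request body.
    return {
        "action": "CreateSubmissionRequest",
        "appArgs": [""],
        "appResource": "hdfs://host-ip:9000/test/testing.jar",
        "clientSparkVersion": "1.6.2",
        "environmentVariables": {"SPARK_ENV_LOADED": "1"},
        "mainClass": "Test",
        "sparkProperties": {
            "spark.jars": "hdfs://host-ip:9000/test/testing.jar",
            "spark.driver.extraJavaOptions": extra_java_options,
            "spark.app.name": "Test",
            "spark.master": "spark://host-ip:7077",
            "spark.submit.deployMode": "cluster",
        },
    }

payload = build_submission(
    "-Dlog4j.configuration=file:///home/log4j-driver.properties")
# Every top-level key is one the endpoint knows about:
assert set(payload) <= KNOWN_TOP_LEVEL
body = json.dumps(payload)  # this string is what curl --data would send
```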

Recommended answer

This works now: use -Dlog4j.configuration=file:/// (a path to a local file) and put spark.driver.extraJavaOptions inside sparkProperties:

curl -X POST http://host-ip:6066/v1/submissions/create --header "Content-Type:application/json;charset=UTF-8" --data '{
  "action" : "CreateSubmissionRequest",
  "appArgs" : [ "" ],
  "appResource" : "hdfs://host-ip:9000/test/testing.jar",
  "clientSparkVersion" : "1.6.2",
  "environmentVariables" : {
    "SPARK_ENV_LOADED" : "1"
  },
  "mainClass" : "Test",
  "sparkProperties" : {
    "spark.jars" : "hdfs://host-ip:9000/test/testing.jar",
    "spark.driver.extraJavaOptions" : "-Dlog4j.configuration=file:///home/log4j-driver.properties",
    "spark.app.name" : "Test",
    "spark.eventLog.enabled" : "true",
    "spark.eventLog.dir" : "hdfs://host-ip:9000/test/spark-events",
    "spark.submit.deployMode" : "client",
    "spark.master" : "spark://host-ip:7077"
  }
}'
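Once a CreateSubmissionResponse returns a submissionId, the same hidden REST API on port 6066 also exposes status (GET /v1/submissions/status/&lt;id&gt;) and kill (POST /v1/submissions/kill/&lt;id&gt;) endpoints. A hedged sketch of polling the driver's state; the helper names are mine, host-ip is the placeholder from the question, and poll_status only works against a live standalone master:

```python
from urllib import request

BASE = "http://host-ip:6066/v1/submissions"

def status_url(submission_id):
    # GET this URL to check a driver's state after submission.
    return "%s/status/%s" % (BASE, submission_id)

def kill_url(submission_id):
    # POST to this URL to stop a running driver.
    return "%s/kill/%s" % (BASE, submission_id)

def poll_status(submission_id):
    # Network call: requires a reachable standalone master on port 6066.
    with request.urlopen(status_url(submission_id)) as resp:
        return resp.read().decode("utf-8")
```

The equivalent with curl would be, e.g., curl http://host-ip:6066/v1/submissions/status/driver-20160810210057-0091.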

