K8s Spark Job JAR params
Question
I am using the manifest below, and when applying it I get the error shown underneath. Is this the correct way to pass JAR arguments?
apiVersion: batch/v1
kind: Job
metadata:
  name: spark-on-eks
spec:
  template:
    spec:
      containers:
        - name: spark
          image: repo:buildversion
          command: [
            "/bin/sh",
            "-c",
            "/opt/spark/bin/spark-submit \
            --master k8s://EKSEndpoint \
            --deploy-mode cluster \
            --name spark-luluapp \
            --class com.ll.jsonclass \
            --conf spark.jars.ivy=/tmp/.ivy \
            --conf spark.kubernetes.container.image=repo:buildversion \
            --conf spark.kubernetes.namespace=spark-pi \
            --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-sa \
            --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
            --conf spark.kubernetes.authenticate.executor.serviceAccountName=spark-sa \
            --conf spark.kubernetes.driver.pod.name=spark-job-driver \
            --conf spark.executor.instances=4 \
            local:///opt/spark/examples/App-buildversion-SNAPSHOT.jar \
            [mks,env,reg,"dd.mm.yyyy","true","off","db-comp-results","true","XX","XXX","XXXXX","XXX",$$,###] "
          ]
      serviceAccountName: spark-pi
      restartPolicy: Never
  backoffLimit: 4
error converting YAML to JSON: yaml: line 33: did not find expected ',' or ']'
Answer
Your YAML formatting is wrong; try this one. The likely cause of the error is that the original command is a double-quoted YAML scalar, so the parser ends the string at the first unescaped double quote (around "dd.mm.yyyy") and then expects ',' or ']'. Writing the command as a block-style list with a single-quoted scalar avoids this:
apiVersion: batch/v1
kind: Job
metadata:
  name: spark-on-eks
spec:
  template:
    spec:
      containers:
        - name: spark
          image: repo:buildversion
          command:
            - "/bin/sh"
            - "-c"
            - '/opt/spark/bin/spark-submit \
              --master k8s://EKSEndpoint \
              --deploy-mode cluster \
              --name spark-luluapp \
              --class com.ll.jsonclass \
              --conf spark.jars.ivy=/tmp/.ivy \
              --conf spark.kubernetes.container.image=repo:buildversion \
              --conf spark.kubernetes.namespace=spark-pi \
              --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark-sa \
              --conf spark.hadoop.fs.s3a.impl=org.apache.hadoop.fs.s3a.S3AFileSystem \
              --conf spark.kubernetes.authenticate.executor.serviceAccountName=spark-sa \
              --conf spark.kubernetes.driver.pod.name=spark-job-driver \
              --conf spark.executor.instances=4 \
              local:///opt/spark/examples/App-buildversion-SNAPSHOT.jar \
              [mks,env,reg,"dd.mm.yyyy","true","off","db-comp-results","true","XX","XXX","XXXXX","XXX",$$,###] '
      serviceAccountName: spark-pi
      restartPolicy: Never
  backoffLimit: 4
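As a side note on the original question, spark-submit passes everything after the application JAR path to the main class as application arguments, so the bracketed list above should arrive as a single element of args (since it contains no spaces; note the shell also expands $$ to its process ID before spark-submit sees it). A minimal sketch of unpacking such an argument on the driver side; the class and helper names here are hypothetical, not part of the original job:

```java
// Hypothetical sketch: spark-submit forwards everything after the
// application JAR to the main class, so a value like
// "[mks,env,reg,\"dd.mm.yyyy\",...]" arrives as one element of args.
public class JobArgs {

    // Strips surrounding [ ] brackets, splits on commas, and removes
    // optional surrounding double quotes from each token.
    static String[] parseJobArgs(String raw) {
        String trimmed = raw.trim();
        if (trimmed.startsWith("[") && trimmed.endsWith("]")) {
            trimmed = trimmed.substring(1, trimmed.length() - 1);
        }
        String[] parts = trimmed.split(",");
        for (int i = 0; i < parts.length; i++) {
            // e.g. "dd.mm.yyyy" -> dd.mm.yyyy
            parts[i] = parts[i].trim().replaceAll("^\"|\"$", "");
        }
        return parts;
    }

    public static void main(String[] args) {
        // In the real job, args[0] would be the bracketed list from the manifest.
        String raw = args.length > 0
                ? args[0]
                : "[mks,env,reg,\"dd.mm.yyyy\",\"true\"]";
        for (String p : parseJobArgs(raw)) {
            System.out.println(p);
        }
    }
}
```

If the individual values are meant to be separate args elements instead, they would need to be written as space-separated words after the JAR path rather than as one comma-separated token.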