How to deploy Spark application jar file to Kubernetes cluster?
Problem description
I am currently trying to deploy a Spark example jar on a Kubernetes cluster running on IBM Cloud.
If I follow these instructions to deploy Spark on a Kubernetes cluster, I am not able to launch Spark Pi, because I always get the error message:
The system cannot find the file specified
after entering the command:
bin/spark-submit \
--master k8s://<url of my kubernetes cluster> \
--deploy-mode cluster \
--name spark-pi \
--class org.apache.spark.examples.SparkPi \
--conf spark.executor.instances=5 \
--conf spark.kubernetes.container.image=<spark-image> \
local:///examples/jars/spark-examples_2.11-2.3.0.jar
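For context, the Spark 2.3 distribution ships a helper script, bin/docker-image-tool.sh, that builds a container image which already contains the example jars. A command sketch, where the registry name is a placeholder and the tag is an assumption:

```shell
# Run from the root of the unpacked Spark distribution.
# Builds a Spark image (which includes examples/jars) and pushes it to a registry.
./bin/docker-image-tool.sh -r <registry> -t 2.3.0 build
./bin/docker-image-tool.sh -r <registry> -t 2.3.0 push
```

The resulting image name is what you would pass to spark.kubernetes.container.image.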
I am in the correct directory, with the spark-examples_2.11-2.3.0.jar file in the examples/jars directory.
Recommended answer
Ensure your .jar file is present inside the container image.
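If you are packaging your own application jar, one minimal sketch is to extend a Spark base image and copy the jar to the path your local:// URI points to. The base image name and paths below are assumptions, not values from the question:

```dockerfile
# Hypothetical Dockerfile: bake the application jar into the image at the
# exact path that the local:// URI in spark-submit will refer to.
FROM <spark-base-image>
COPY spark-examples_2.11-2.3.0.jar /examples/jars/spark-examples_2.11-2.3.0.jar
```

After building and pushing this image, reference it via --conf spark.kubernetes.container.image.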
The instructions state that it should be there:
Finally, notice that in the above example we specify a jar with a specific URI with a scheme of local://. This URI is the location of the example jar that is already in the Docker image.
In other words, the local:// scheme is stripped from local:///examples/jars/spark-examples_2.11-2.3.0.jar, and the remaining path /examples/jars/spark-examples_2.11-2.3.0.jar is expected to exist inside the container image.
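The scheme handling can be illustrated with plain string manipulation; this is only a sketch of how the URI decomposes, not Spark's actual parsing code:

```shell
# The URI passed to spark-submit
uri="local:///examples/jars/spark-examples_2.11-2.3.0.jar"

# Removing the local:// scheme leaves the path that must exist
# inside the container image, not on the submitting machine.
path="${uri#local://}"

echo "$path"   # /examples/jars/spark-examples_2.11-2.3.0.jar
```

The key point: the file is resolved inside the driver and executor containers, so a jar sitting in your local examples/jars directory is never consulted.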