How to deploy Spark application jar file to Kubernetes cluster?


Problem description

I am currently trying to deploy a Spark example jar on a Kubernetes cluster running on IBM Cloud.

If I try to follow these instructions to deploy Spark on a Kubernetes cluster, I am not able to launch Spark Pi, because I always get the error message:

The system cannot find the file specified

after entering the command:

bin/spark-submit \
    --master k8s://<url of my kubernetes cluster> \
    --deploy-mode cluster \
    --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=5 \
    --conf spark.kubernetes.container.image=<spark-image> \
    local:///examples/jars/spark-examples_2.11-2.3.0.jar

I am in the right directory, with the spark-examples_2.11-2.3.0.jar file present in the examples/jars directory.

Recommended answer

Ensure your .jar file is present inside the container image.
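One quick way to check is to list the expected path inside the image (a sketch assuming Docker is available locally; `<spark-image>` is the same placeholder used in the spark-submit command above):

```shell
# List the directory the local:// URI points at; if the jar is not here,
# spark-submit in cluster mode cannot find the file.
docker run --rm --entrypoint ls <spark-image> /examples/jars/
```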

The instructions state that it should be there:

Finally, notice that in the above example we specify a jar with a specific URI with a scheme of local://. This URI is the location of the example jar that is already in the Docker image.

In other words, the local:// scheme is stripped from local:///examples/jars/spark-examples_2.11-2.3.0.jar, and the remaining path /examples/jars/spark-examples_2.11-2.3.0.jar is expected to exist inside the container image.
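If you are deploying your own application jar rather than the bundled example, the usual approach is to bake it into the image at build time. A minimal sketch, assuming a Spark base image named spark-base:2.3.0 and a hypothetical jar target/my-app.jar (both names are illustrative, not from the question):

```dockerfile
# Extend a Spark base image and copy the application jar to a fixed path
# inside the image, so spark-submit can reference it with a local:// URI.
FROM spark-base:2.3.0
COPY target/my-app.jar /opt/spark/jars/my-app.jar
```

The application would then be submitted with local:///opt/spark/jars/my-app.jar as the jar argument to spark-submit.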
