Passing additional jars to Spark via spark-submit


Problem Description


I'm using Spark with MongoDB, and consequently rely on the mongo-hadoop drivers. I got things working thanks to input on my original question here.

My Spark job runs; however, I get a warning that I don't understand. When I run this command

$SPARK_HOME/bin/spark-submit --driver-class-path /usr/local/share/mongo-hadoop/build/libs/mongo-hadoop-1.5.0-SNAPSHOT.jar:/usr/local/share/mongo-hadoop/spark/build/libs/mongo-hadoop-spark-1.5.0-SNAPSHOT.jar --jars /usr/local/share/mongo-hadoop/build/libs/mongo-hadoop-1.5.0-SNAPSHOT.jar:/usr/local/share/mongo-hadoop/spark/build/libs/mongo-hadoop-spark-1.5.0-SNAPSHOT.jar my_application.py

it works, but gives me the following warning message

Warning: Local jar /usr/local/share/mongo-hadoop/build/libs/mongo-hadoop-1.5.0-SNAPSHOT.jar:/usr/local/share/mongo-hadoop/spark/build/libs/mongo-hadoop-spark-1.5.0-SNAPSHOT.jar does not exist, skipping.

While I was trying to get this working, the job wouldn't run at all if I left out those paths when submitting it. Now, however, it runs even if I leave those paths out

$SPARK_HOME/bin/spark-submit my_application.py

Can someone please explain what is going on here? I have looked through similar questions here referencing the same warning, and searched through the documentation.

By setting the options once, are they stored as environment variables or something? I'm glad it works, but I'm wary that I don't fully understand why it works sometimes and not others.

Solution

The problem is that the classpath (--driver-class-path) should be colon-separated, while the jar list (--jars) should be comma-separated. Because --jars splits its argument on commas, your colon-joined string was parsed as the path of a single jar; no file with that name exists, which is exactly what the "Local jar ... does not exist, skipping" warning reports. The corrected command is:

$SPARK_HOME/bin/spark-submit \
--driver-class-path /usr/local/share/mongo-hadoop/build/libs/mongo-hadoop-1.5.0-SNAPSHOT.jar:/usr/local/share/mongo-hadoop/spark/build/libs/mongo-hadoop-spark-1.5.0-SNAPSHOT.jar \
--jars /usr/local/share/mongo-hadoop/build/libs/mongo-hadoop-1.5.0-SNAPSHOT.jar,/usr/local/share/mongo-hadoop/spark/build/libs/mongo-hadoop-spark-1.5.0-SNAPSHOT.jar my_application.py
