Spark-Submit: --packages vs --jars
Question
Can someone explain the differences between --packages and --jars in a spark-submit script?
nohup ./bin/spark-submit --jars ./xxx/extrajars/stanford-corenlp-3.8.0.jar,./xxx/extrajars/stanford-parser-3.8.0.jar \
  --packages datastax:spark-cassandra-connector_2.11:2.0.7 \
  --class xxx.mlserver.Application \
  --conf spark.cassandra.connection.host=192.168.0.33 \
  --conf spark.cores.max=4 \
  --master spark://192.168.0.141:7077 ./xxx/xxxanalysis-mlserver-0.1.0.jar 1000 > ./logs/nohup.out &
Also, do I need the --packages configuration if the dependency is already in my application's pom.xml? (I ask because I just broke my application by changing the version in --packages while forgetting to change it in the pom.xml.)
I am currently using --jars because the jars are massive (over 100 GB) and thus slow down the shaded-jar compilation. I admit I am not sure why I am using --packages, other than that I am following the datastax documentation.
Answer
If you run spark-submit --help, it will show:
--jars JARS Comma-separated list of jars to include on the driver
and executor classpaths.
--packages Comma-separated list of maven coordinates of jars to include
on the driver and executor classpaths. Will search the local
maven repo, then maven central and any additional remote
repositories given by --repositories. The format for the
coordinates should be groupId:artifactId:version.
With --jars, Spark does not hit Maven; it searches for the specified jars in the local file system. It also supports the following URL schemes: hdfs, http, https, and ftp.
With --packages, Spark searches for the given package in the local Maven repo, then Maven Central, then any repository provided via --repositories, and downloads it.
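To make the difference concrete, here is a sketch of the two ways to pull in the same connector. The paths, coordinates, and master URL are illustrative assumptions, not taken verbatim from the question, and the commands obviously require a Spark installation to run:

```shell
# Option 1: --jars — you obtain the jar yourself; Spark never touches Maven.
# Local paths as well as hdfs://, http(s):// and ftp:// URLs are accepted.
./bin/spark-submit \
  --jars /opt/extrajars/spark-cassandra-connector_2.11-2.0.7.jar \
  --class xxx.mlserver.Application \
  --master spark://192.168.0.141:7077 \
  app.jar

# Option 2: --packages — Spark resolves the groupId:artifactId:version
# coordinate from the local Maven repo, then Maven Central, then any
# repository listed in --repositories, and caches the download locally.
./bin/spark-submit \
  --packages datastax:spark-cassandra-connector_2.11:2.0.7 \
  --class xxx.mlserver.Application \
  --master spark://192.168.0.141:7077 \
  app.jar
```

Both end up putting the same jar on the driver and executor classpaths; the only difference is who does the fetching.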
Now, coming back to your questions:
Also, do I require the --packages configuration if the dependency is in my application's pom.xml?

Ans: No, if you are not importing/using the classes in the jar directly, but need the classes to be loaded by some class loader or service loader (e.g. JDBC drivers). Yes otherwise.
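The JDBC-driver case deserves a concrete sketch: the driver class is never imported in your own code, so shading your pom.xml dependencies is not strictly the issue; the class is looked up by name at runtime and must therefore be on the driver and executor classpaths. Assuming a hypothetical job that reads from PostgreSQL:

```shell
# The PostgreSQL driver class (org.postgresql.Driver) is loaded
# reflectively by JDBC at runtime, so it must be on the classpath
# of the driver and executors even if it also appears in pom.xml.
./bin/spark-submit \
  --packages org.postgresql:postgresql:42.2.5 \
  --class xxx.mlserver.Application \
  --master spark://192.168.0.141:7077 \
  app.jar
```

If you keep the version in both --packages and pom.xml, they must be changed together, which is exactly the mismatch that broke the asker's application.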
BTW, if you are using a specific version of a specific jar in your pom.xml, then why not build an uber/fat jar of your application, or provide the dependency jars in the --jars argument, instead of using --packages?
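The uber/fat-jar route suggested above would look roughly like this, assuming a Maven build with the shade plugin already configured in the pom.xml (the artifact name is illustrative):

```shell
# Build one self-contained jar with all pom.xml dependencies baked in
# (assumes the maven-shade-plugin is configured in the pom).
mvn -DskipTests package

# No --packages / --jars flags are then needed for those dependencies:
./bin/spark-submit \
  --class xxx.mlserver.Application \
  --master spark://192.168.0.141:7077 \
  target/xxxanalysis-mlserver-0.1.0-shaded.jar 1000
```

The trade-off is the one the asker already hit: with very large dependencies, building the shaded jar is slow, which is a legitimate reason to keep them out of the fat jar and ship them via --jars instead.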
Reference link:
add-jars-to-a-spark-job-spark-submit
This concludes the article on Spark-Submit: --packages vs --jars. We hope the recommended answer helps, and thanks for supporting IT屋!