spark submit add multiple jars in classpath


Question

I am trying to run a Spark program that uses multiple jar files; with only one jar I am not able to run it. I want to add both jar files, which are in the same location. I have tried the below, but it shows a dependency error:

spark-submit \
  --class "max" maxjar.jar Book1.csv test \
  --driver-class-path /usr/lib/spark/assembly/lib/hive-common-0.13.1-cdh5.3.0.jar

How can I add another jar file which is in the same directory?

I want to add /usr/lib/spark/assembly/lib/hive-serde.jar.
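For context on what the question is attempting: `--jars` takes a comma-separated list, and all `spark-submit` options must come before the application jar. A minimal sketch, assuming the jars live in one directory (the helper name `join_jars` is hypothetical, not part of the original question):

```shell
# Hypothetical helper: join every jar in a directory into the
# comma-separated form that spark-submit's --jars flag expects.
join_jars() {
  local dir="$1" out="" f
  for f in "$dir"/*.jar; do
    [ -e "$f" ] || continue          # skip if the glob matched nothing
    out="${out:+$out,}$f"            # append with a comma after the first entry
  done
  printf '%s\n' "$out"
}

# Example invocation (paths taken from the question):
# spark-submit --class "max" \
#   --jars "$(join_jars /usr/lib/spark/assembly/lib)" \
#   maxjar.jar Book1.csv test
```

Note that in the command from the question, `--driver-class-path` appears after `maxjar.jar`, so it is passed to the application as an argument rather than parsed by `spark-submit`.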

Answer

I was trying to connect to MySQL from Python code that was executed using spark-submit.

I was using the HDP sandbox with Ambari. I tried a lot of options, such as --jars, --driver-class-path, etc., but none worked.

Copied the jar to /usr/local/miniconda/lib/python2.7/site-packages/pyspark/jars/

As of now I'm not sure if it's a solution or a quick hack, but since I'm working on a POC, it works for me.
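The workaround above can be sketched as follows. This is a minimal sketch, assuming a pip-installed pyspark whose bundled `jars/` directory serves as its classpath; the helper name `copy_to_pyspark_jars` is hypothetical:

```shell
# Hypothetical helper: drop an extra jar into pyspark's bundled jars
# directory so it ends up on the classpath without any --jars flag.
copy_to_pyspark_jars() {
  local jar="$1" pyspark_jars="$2"
  cp "$jar" "$pyspark_jars/"
}

# Example invocation (paths taken from the answer):
# copy_to_pyspark_jars /usr/lib/spark/assembly/lib/hive-serde.jar \
#   /usr/local/miniconda/lib/python2.7/site-packages/pyspark/jars
```

As the answer notes, this is more of a quick hack than a clean solution: the jar will be lost when the pyspark package is upgraded or reinstalled.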

