No suitable driver found for jdbc in Spark
Question
I use
df.write.mode("append").jdbc("jdbc:mysql://ip:port/database", "table_name", properties)
to insert into a table in MySQL.
Also, I have added Class.forName("com.mysql.jdbc.Driver")
in my code.
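For context, here is a minimal, self-contained sketch of that write path. The host, port, database, table name, and credentials are placeholders, and the explicit "driver" connection property is an optional addition (not in the question) that avoids relying on DriverManager discovery:

import java.util.Properties
import org.apache.spark.sql.DataFrame

// Hypothetical helper wrapping the write shown above; `df` is whatever
// DataFrame the application has built.
def writeToMySql(df: DataFrame): Unit = {
  // Register the driver, as in the question.
  Class.forName("com.mysql.jdbc.Driver")

  val properties = new Properties()
  properties.put("user", "db_user")         // placeholder credentials
  properties.put("password", "db_password")
  // Optionally name the driver class explicitly in the connection
  // properties instead of relying on DriverManager discovery.
  properties.put("driver", "com.mysql.jdbc.Driver")

  df.write
    .mode("append")
    .jdbc("jdbc:mysql://ip:port/database", "table_name", properties)
}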
When I submit the Spark application:
spark-submit --class MY_MAIN_CLASS \
  --master yarn-client \
  --jars /path/to/mysql-connector-java-5.0.8-bin.jar \
  --driver-class-path /path/to/mysql-connector-java-5.0.8-bin.jar \
  MY_APPLICATION.jar
This yarn-client mode works for me.
But when I use yarn-cluster mode:
spark-submit --class MY_MAIN_CLASS \
  --master yarn-cluster \
  --jars /path/to/mysql-connector-java-5.0.8-bin.jar \
  --driver-class-path /path/to/mysql-connector-java-5.0.8-bin.jar \
  MY_APPLICATION.jar
It doesn't work. I also tried setting "--conf":
spark-submit --class MY_MAIN_CLASS \
  --master yarn-cluster \
  --jars /path/to/mysql-connector-java-5.0.8-bin.jar \
  --driver-class-path /path/to/mysql-connector-java-5.0.8-bin.jar \
  --conf spark.executor.extraClassPath=/path/to/mysql-connector-java-5.0.8-bin.jar \
  MY_APPLICATION.jar
but still get the "No suitable driver found for jdbc" error.
Answer
There are 3 possible solutions:
- You might want to assemble your application with your build manager (Maven, SBT), so you won't need to add the dependencies in your spark-submit CLI (see the build.sbt sketch after this list).
- You can use the following option in your spark-submit CLI:
--jars $(echo ./lib/*.jar | tr ' ' ',')
Explanation: supposing that you have all your jars in a lib directory in your project root, this will read all the libraries and add them to the application submission.
- You can also try to configure these 2 variables, spark.driver.extraClassPath and spark.executor.extraClassPath, in the SPARK_HOME/conf/spark-defaults.conf file, and set their values to the path of the jar file. Ensure that the same path exists on the worker nodes.
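For example, the two entries in spark-defaults.conf could look like this (a sketch reusing the jar path from the question; if the file does not exist yet, it can be copied from spark-defaults.conf.template):

spark.driver.extraClassPath    /path/to/mysql-connector-java-5.0.8-bin.jar
spark.executor.extraClassPath  /path/to/mysql-connector-java-5.0.8-bin.jar

Unlike --jars on a single submit, entries in spark-defaults.conf apply to every application submitted from that installation.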
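And for the first solution (assembling the application), a minimal build.sbt sketch assuming sbt with the sbt-assembly plugin; the project name and all version numbers are illustrative:

// project/plugins.sbt (illustrative plugin version):
// addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")

// build.sbt -- bundles the MySQL driver into the application jar,
// so no --jars / --driver-class-path flags are needed at submit time.
name := "my-application"
scalaVersion := "2.10.6"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql" % "1.6.3" % "provided", // supplied by the cluster
  "mysql" % "mysql-connector-java" % "5.1.38"               // JDBC driver, bundled
)

Running sbt assembly then produces a single fat jar that can be passed to spark-submit as MY_APPLICATION.jar.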