Spark Unable to find JDBC Driver

Problem description

So I've been using sbt with assembly to package all my dependencies into a single jar for my Spark jobs. I've got several jobs where I was using c3p0 to set up connection pool information, broadcast it out, and then use foreachPartition on the RDD to grab a connection and insert the data into the database. In my sbt build script, I include

"mysql" % "mysql-connector-java" % "5.1.33"

This makes sure the JDBC connector is packaged up with the job. Everything works great.

So recently I started playing around with Spark SQL and realized it's much easier to simply take a DataFrame and save it to a JDBC source with the new features in 1.3.0.

The problem is, I'm getting the following exception

java.sql.SQLException: No suitable driver found for jdbc:mysql://some.domain.com/myschema?user=user&password=password
    at java.sql.DriverManager.getConnection(DriverManager.java:596)
    at java.sql.DriverManager.getConnection(DriverManager.java:233)
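This exception comes straight from java.sql.DriverManager: when no registered driver accepts the URL, it throws before any network I/O happens. A minimal sketch that reproduces the message on a JVM with no MySQL driver on the classpath (the URL is illustrative):

```java
import java.sql.DriverManager;
import java.sql.SQLException;

public class NoSuitableDriverDemo {
    public static void main(String[] args) {
        // DriverManager asks each registered driver whether it accepts the
        // URL; with no MySQL driver on the classpath, none does, so it
        // fails before ever attempting a connection.
        try {
            DriverManager.getConnection("jdbc:mysql://localhost:3306/myschema");
        } catch (SQLException e) {
            System.out.println(e.getMessage());
            // → No suitable driver found for jdbc:mysql://localhost:3306/myschema
        }
    }
}
```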

When I was running this locally I got around it by setting

SPARK_CLASSPATH=/path/where/mysql-connector-is.jar

Ultimately, what I want to know is: why is the job not capable of finding the driver when it should be packaged up with it? My other jobs never had this problem. From what I can tell, both c3p0 and the DataFrame code make use of java.sql.DriverManager (which, as far as I can tell, handles loading everything for you), so it should work just fine. If there is something that prevents the assembly approach from working, what do I need to do to make this work?
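One detail worth knowing here: DriverManager only hands out drivers whose class is visible to the classloader of the *calling* code, so a driver bundled in the assembly jar can be registered yet invisible to JDBC code that was loaded by a different classloader (which is what happens inside Spark's executors). A small sketch for inspecting what the current classloader can see; com.mysql.jdbc.Driver is the class shipped by the mysql-connector-java dependency above:

```java
import java.sql.Driver;
import java.sql.DriverManager;
import java.util.Enumeration;

public class DriverVisibility {
    public static void main(String[] args) {
        // List every driver DriverManager will expose to code loaded by
        // this class's classloader.
        Enumeration<Driver> drivers = DriverManager.getDrivers();
        while (drivers.hasMoreElements()) {
            System.out.println("registered: " + drivers.nextElement().getClass().getName());
        }

        // Check whether the MySQL driver class itself is loadable here;
        // if this fails, DriverManager cannot match a jdbc:mysql:// URL.
        try {
            Class.forName("com.mysql.jdbc.Driver");
            System.out.println("com.mysql.jdbc.Driver is on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("com.mysql.jdbc.Driver is NOT on the classpath");
        }
    }
}
```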

Answer

This person was having a similar issue: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-use-DataFrame-with-MySQL-td22178.html

Have you updated your connector drivers to the most recent version? Also did you specify the driver class when you called load()?

Map<String, String> options = new HashMap<String, String>();
options.put("url", "jdbc:mysql://localhost:3306/video_rcmd?user=root&password=123456");
options.put("dbtable", "video");
options.put("driver", "com.mysql.jdbc.Driver"); // explicitly name the driver class here
DataFrame jdbcDF = sqlContext.load("jdbc", options);

In spark/conf/spark-defaults.conf, you can also set spark.driver.extraClassPath and spark.executor.extraClassPath to the path of your MySQL driver .jar.
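For example, reusing the path from the SPARK_CLASSPATH workaround above (adjust it to wherever the connector jar actually lives), the two entries in spark-defaults.conf would look like:

```
spark.driver.extraClassPath   /path/where/mysql-connector-is.jar
spark.executor.extraClassPath /path/where/mysql-connector-is.jar
```

Setting both matters: the driver-side entry covers the JVM that plans the job, while the executor-side entry covers the JVMs that actually open connections inside foreachPartition or the JDBC writer.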
