PySpark No suitable driver found for jdbc:mysql://dbhost


Problem description

I am trying to write my dataframe to a MySQL table. I get "No suitable driver found for jdbc:mysql://dbhost" when I try to write.

As part of the preprocessing I read from other tables in the same DB and have no issues doing that. I can do the full run and save the rows to a Parquet file, so it is definitely reading from the MySQL DB.

I submit with:

spark-submit --conf spark.executor.extraClassPath=/home/user/Downloads/mysql-connector-java-5.1.35-bin.jar --driver-class-path /home/user/Downloads/mysql-connector-java-5.1.35-bin.jar --jars /home/user/Downloads/mysql-connector-java-5.1.35-bin.jar main.py

I am using:

df.write.jdbc(url="jdbc:mysql://dbhost/dbname", table="tablename", mode="append", properties={"user": "dbuser", "password": "s3cret"})
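A common mitigation for this error is to name the JDBC driver class explicitly in the connection properties, so Spark loads the class directly instead of relying on `DriverManager`'s classloader-based lookup. The sketch below shows only the properties dict; the host, database, and table names are the placeholders from the question, and whether this alone fixes the error depends on the Spark version.

```python
# Hedged sketch: pass the driver class explicitly via the "driver" key.
# For MySQL Connector/J 5.x the class name is com.mysql.jdbc.Driver.
properties = {
    "user": "dbuser",
    "password": "s3cret",
    # Tells Spark which JDBC driver class to load directly, bypassing
    # DriverManager's scan of registered drivers.
    "driver": "com.mysql.jdbc.Driver",
}

# The write call itself is unchanged apart from the extra property:
# df.write.jdbc(url="jdbc:mysql://dbhost/dbname", table="tablename",
#               mode="append", properties=properties)
```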

Recommended answer

This is a bug related to the classloader. This is the ticket for it: https://issues.apache.org/jira/browse/SPARK-8463 and this is the pull request for it: https://github.com/apache/spark/pull/6900.

A workaround is to copy mysql-connector-java-5.1.35-bin.jar to every machine, at the same location as it is on the driver.
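With the jar present at the same path on every node, the classpath can also be set once in the cluster's configuration instead of on each spark-submit invocation. A minimal sketch, assuming the jar lives at the path used in the question and that `$SPARK_HOME/conf/spark-defaults.conf` is the active config file:

```
# $SPARK_HOME/conf/spark-defaults.conf (assumed location)
# Both driver and executors need the connector on their classpath.
spark.driver.extraClassPath    /home/user/Downloads/mysql-connector-java-5.1.35-bin.jar
spark.executor.extraClassPath  /home/user/Downloads/mysql-connector-java-5.1.35-bin.jar
```

This only works because the workaround above guarantees the jar exists at that path on every machine; a path that exists only on the driver would still fail on the executors.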

