Spark : Exception in thread "main" java.lang.ClassNotFoundException: com.mysql.jdbc.Driver


Problem Description

I wrote a simple program in Spark to write a dataframe to a table in MySQL.

The program is as follows:

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

import java.util.Properties

object MySQLTrial {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setAppName("AnalyseBusStreaming")
    val sc = new SparkContext(sparkConf)
    val df = sc.parallelize(Array((1, 234), (2, 1233)))
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    import sqlContext.implicits._
    val prop = new Properties()
    prop.put("user", "admin")
    prop.put("password", "admin")

    val driver = "com.mysql.jdbc.Driver"
    Class.forName(driver)
    val dfWriter = df.toDF().write.mode("append")
    dfWriter.jdbc("jdbc:mysql://127.0.0.1:3306/IOT_DB", "table1", prop)
  }
}
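As an aside, Spark's JDBC writer also accepts a `driver` key in the connection `Properties`; Spark then loads the driver class itself wherever the connection is actually opened, instead of relying on a manual `Class.forName` on the driver node. A minimal sketch (same connection details as above; the MySQL jar must still be on the classpath — this only replaces the explicit `Class.forName` call, it does not fix a missing jar):

```scala
// Sketch: let Spark load the JDBC driver via the "driver" property
val prop = new Properties()
prop.put("user", "admin")
prop.put("password", "admin")
prop.put("driver", "com.mysql.jdbc.Driver") // loaded by Spark, no Class.forName needed

df.toDF().write
  .mode("append")
  .jdbc("jdbc:mysql://127.0.0.1:3306/IOT_DB", "table1", prop)
```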

The POM file for my project is as follows:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>ggi.bigdata</groupId>
    <artifactId>internet_of_things</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.6.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>1.6.0</version>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>5.1.38</version>
        </dependency>
    </dependencies>
</project>

I'm running this program using spark-submit (tried in both local and yarn mode). I haven't explicitly included any jar files to run this code. I keep getting the error:

Exception in thread "main" java.lang.ClassNotFoundException: com.mysql.jdbc.Driver

What should be done about this?

Recommended Answer

This is because your driver isn't present in the uber-jar that you are submitting to the cluster, whether it's a standalone cluster, yarn, mesos, etc.

Solution 1 : Since you are using Maven, you can use the assembly plugin to build your uber-jar with all the needed dependencies. See the maven-assembly-plugin documentation for more information.
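As a sketch of Solution 1, the assembly plugin can be added to the `<project>` element of the POM above with the built-in `jar-with-dependencies` descriptor, so that `mvn package` also produces an uber-jar containing mysql-connector-java (the plugin version here is an assumption; pick whatever is current for your build):

```xml
<build>
  <plugins>
    <!-- Builds an additional uber-jar bundling all runtime dependencies -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-assembly-plugin</artifactId>
      <version>2.6</version>
      <configuration>
        <descriptorRefs>
          <descriptorRef>jar-with-dependencies</descriptorRef>
        </descriptorRefs>
      </configuration>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>single</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```

You would then submit the resulting `*-jar-with-dependencies.jar` instead of the plain artifact jar.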

Solution 2 : Provide these dependency libraries at runtime when you submit your application, using the --jars option. I advise you to read more about advanced dependency management and submitting applications in the official documentation.

For example, it can look like this:

./bin/spark-submit \
  --class <main-class> \
  --master <master-url> \
  --jars /path/to/mysql-connector-java*.jar \
  <application-jar>
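One caveat: `--jars` distributes the jar to the executors, but when the `ClassNotFoundException` is thrown in the driver process itself (as the "main" thread in this stack trace suggests), the jar may also need to be on the driver's classpath via `--driver-class-path`. A sketch with both options (paths and placeholders are illustrative):

```shell
# Put the connector on both the executor and driver classpaths
./bin/spark-submit \
  --class <main-class> \
  --master <master-url> \
  --jars /path/to/mysql-connector-java-5.1.38.jar \
  --driver-class-path /path/to/mysql-connector-java-5.1.38.jar \
  <application-jar>
```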

I hope this helps!

