Getting Spark Logging class not found when using Spark SQL
Problem Description
I am trying to do some simple Spark SQL programming in Java. In the program, I get data from a Cassandra table, convert the RDD into a Dataset, and display the data. When I run the spark-submit command, I get the error: java.lang.ClassNotFoundException: org.apache.spark.internal.Logging.
My program is:
import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapRowTo;
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

SparkConf sparkConf = new SparkConf().setAppName("DataFrameTest")
        .set("spark.cassandra.connection.host", "abc")
        .set("spark.cassandra.auth.username", "def")
        .set("spark.cassandra.auth.password", "ghi");
SparkContext sparkContext = new SparkContext(sparkConf);
// Read the Cassandra table test.log and map each row to a Log bean
JavaRDD<Log> logsRDD = javaFunctions(sparkContext).cassandraTable("test", "log",
        mapRowTo(Log.class));
SparkSession sparkSession = SparkSession.builder().appName("Java Spark SQL").getOrCreate();
Dataset<Row> logsDF = sparkSession.createDataFrame(logsRDD, Log.class);
logsDF.show();
My POM dependencies are:
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.2</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.0.2</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.11</artifactId>
    <version>1.6.3</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.0.2</version>
  </dependency>
</dependencies>
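Note the version skew in the POM above. The usual culprit for this error is mixing Spark 1.x and 2.x artifacts on one classpath: org.apache.spark.internal.Logging only exists in Spark 2.x (in 1.x the trait lived at org.apache.spark.Logging), and spark-cassandra-connector 1.6.3 targets Spark 1.6. A minimal sketch of a version-aligned connector dependency, assuming a 2.0.x connector release exists for Spark 2.0 (the 2.0.2 version below is an assumption, not taken from the question):

<dependency>
  <groupId>com.datastax.spark</groupId>
  <artifactId>spark-cassandra-connector_2.11</artifactId>
  <!-- Assumed 2.0.x release compiled against Spark 2.x; verify the exact version on Maven Central -->
  <version>2.0.2</version>
</dependency>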
My spark-submit command is: /home/ubuntu/spark-2.0.2-bin-hadoop2.7/bin/spark-submit --class com.jtv.spark.dataframes.App --master local[4] spark.dataframes-0.1-jar-with-dependencies.jar
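As an aside, if the fat jar is bundling stale Spark 1.x classes, spark-submit can instead fetch the connector at launch with the --packages flag; this sketch assumes the same hypothetical 2.0.2 connector version as above:

/home/ubuntu/spark-2.0.2-bin-hadoop2.7/bin/spark-submit --class com.jtv.spark.dataframes.App --master local[4] --packages com.datastax.spark:spark-cassandra-connector_2.11:2.0.2 spark.dataframes-0.1-jar-with-dependencies.jar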
How do I solve this error? Downgrading to 1.5.2 does not work, as 1.5.2 does not have org.apache.spark.sql.Dataset and org.apache.spark.sql.SparkSession.
Recommended Answer
Pretty late to the party here, but I added
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.1.1</version>
  <scope>provided</scope>
</dependency>
to solve this issue. It seems to work for my case.
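To confirm the fix, a quick sanity check can be run on the same classpath the application uses. This is a minimal sketch; the LoggingCheck class name is made up for illustration:

public class LoggingCheck {
    public static void main(String[] args) throws Exception {
        // Resolves only if a Spark 2.x spark-core jar is on the classpath;
        // on Spark 1.x the trait lived at org.apache.spark.Logging instead.
        Class<?> logging = Class.forName("org.apache.spark.internal.Logging");
        System.out.println("Found: " + logging.getName());
    }
}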