Spark and Cassandra Java application: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/Dataset

Problem description

I got an amazingly simple Java application which I almost copied from this example: http://markmail.org/download.xqy?id=zua6upabiylzeetp&number=2

All I wanted to do is read the table data and display it in the Eclipse console.

My pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>chat_connaction_test</groupId>
  <artifactId>ChatSparkConnectionTest</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <dependencies>
    <dependency>
      <groupId>com.datastax.cassandra</groupId>
      <artifactId>cassandra-driver-core</artifactId>
      <version>3.1.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>2.0.0</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/com.datastax.spark/spark-cassandra-connector_2.10 -->
    <dependency>
      <groupId>com.datastax.spark</groupId>
      <artifactId>spark-cassandra-connector_2.10</artifactId>
      <version>2.0.0-M3</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-streaming_2.10 -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_2.10</artifactId>
      <version>2.0.0</version>
    </dependency>
    <!--
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-hive_2.10</artifactId>
      <version>1.5.2</version>
    </dependency>
    -->
  </dependencies>
</project>

My Java code:

package com.chatSparkConnactionTest;

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
import java.io.Serializable;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import com.datastax.spark.connector.japi.CassandraRow;

public class JavaDemo implements Serializable {
    private static final long serialVersionUID = 1L;

    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
            .setAppName("chat")
            .setMaster("local")
            .set("spark.executor.memory", "1g")
            .set("spark.cassandra.connection.host", "127.0.0.1");
        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<String> cassandraRowsRDD = javaFunctions(sc)
            .cassandraTable("chat", "dictionary")
            .map(new Function<CassandraRow, String>() {
                @Override
                public String call(CassandraRow cassandraRow) throws Exception {
                    String tempResult = cassandraRow.toString();
                    System.out.println(tempResult);
                    return tempResult;
                }
            });

        System.out.println("Data as CassandraRows: \n" +
            cassandraRowsRDD.collect().size()); // THIS IS THE LINE WITH THE ERROR
    }
}

Here is my error:

16/10/05 20:49:18 INFO CassandraConnector: Connected to Cassandra cluster: Test Cluster
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/Dataset
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Unknown Source)
    at java.lang.Class.getDeclaredMethod(Unknown Source)
    at java.io.ObjectStreamClass.getPrivateMethod(Unknown Source)
    at java.io.ObjectStreamClass.access$1700(Unknown Source)
    at java.io.ObjectStreamClass$2.run(Unknown Source)
    at java.io.ObjectStreamClass$2.run(Unknown Source)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.io.ObjectStreamClass.<init>(Unknown Source)
    at java.io.ObjectStreamClass.lookup(Unknown Source)
    at java.io.ObjectOutputStream.writeObject0(Unknown Source)
    at java.io.ObjectOutputStream.defaultWriteFields(Unknown Source)
    at java.io.ObjectOutputStream.writeSerialData(Unknown Source)
    at java.io.ObjectOutputStream.writeOrdinaryObject(Unknown Source)
    at java.io.ObjectOutputStream.writeObject0(Unknown Source)
    at java.io.ObjectOutputStream.defaultWriteFields(Unknown Source)
    at java.io.ObjectOutputStream.writeSerialData(Unknown Source)
    at java.io.ObjectOutputStream.writeOrdinaryObject(Unknown Source)
    at java.io.ObjectOutputStream.writeObject0(Unknown Source)
    at java.io.ObjectOutputStream.writeObject(Unknown Source)
    at scala.collection.immutable.$colon$colon.writeObject(List.scala:379)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at java.io.ObjectStreamClass.invokeWriteObject(Unknown Source)
    at java.io.ObjectOutputStream.writeSerialData(Unknown Source)
    at java.io.ObjectOutputStream.writeOrdinaryObject(Unknown Source)
    at java.io.ObjectOutputStream.writeObject0(Unknown Source)
    at java.io.ObjectOutputStream.defaultWriteFields(Unknown Source)
    at java.io.ObjectOutputStream.writeSerialData(Unknown Source)
    at java.io.ObjectOutputStream.writeOrdinaryObject(Unknown Source)
    at java.io.ObjectOutputStream.writeObject0(Unknown Source)
    at java.io.ObjectOutputStream.defaultWriteFields(Unknown Source)
    at java.io.ObjectOutputStream.writeSerialData(Unknown Source)
    at java.io.ObjectOutputStream.writeOrdinaryObject(Unknown Source)
    at java.io.ObjectOutputStream.writeObject0(Unknown Source)
    at java.io.ObjectOutputStream.defaultWriteFields(Unknown Source)
    at java.io.ObjectOutputStream.writeSerialData(Unknown Source)
    at java.io.ObjectOutputStream.writeOrdinaryObject(Unknown Source)
    at java.io.ObjectOutputStream.writeObject0(Unknown Source)
    at java.io.ObjectOutputStream.writeObject(Unknown Source)
    at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:43)
    at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:295)
    at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:288)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:108)
    at org.apache.spark.SparkContext.clean(SparkContext.scala:2037)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1896)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:1911)
    at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:893)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:358)
    at org.apache.spark.rdd.RDD.collect(RDD.scala:892)
    at org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:360)
    at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45)
    at com.chatSparkConnactionTest.JavaDemo.main(JavaDemo.java:37)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.Dataset
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    ... 58 more

I updated my pom.xml, but that didn't resolve the error. Could anybody help me resolve this problem?

Thank you!

Update 1:
Here is a screenshot of my build path:
Link to my screenshot

Recommended answer

You are getting the "java.lang.NoClassDefFoundError: org/apache/spark/sql/Dataset" error because the "spark-sql" dependency is missing from your pom.xml file.

If you want to read a Cassandra table with Spark 2.0.0, then you need at least the dependencies below. Note that they all use the _2.11 Scala suffix; every Spark-related artifact on the classpath must be built for the same Scala version, so the _2.10 artifacts from the original pom.xml should be replaced rather than kept alongside these.

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.0.0</version>
</dependency>
<dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.11</artifactId>
    <version>2.0.0-M3</version>
</dependency>

Spark 2.0.0 provides the SparkSession and Dataset APIs. Below is a sample program that reads a Cassandra table and prints the records.

import java.util.HashMap;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SparkCassandraDatasetApplication {
    public static void main(String[] args) {
        SparkSession spark = SparkSession
                .builder()
                .appName("SparkCassandraDatasetApplication")
                .config("spark.sql.warehouse.dir", "/file:C:/temp")
                .config("spark.cassandra.connection.host", "127.0.0.1")
                .config("spark.cassandra.connection.port", "9042")
                .master("local[2]")
                .getOrCreate();

        // Read data from Cassandra through the connector's data source
        Dataset<Row> dataset = spark.read().format("org.apache.spark.sql.cassandra")
                .options(new HashMap<String, String>() {
                    {
                        put("keyspace", "mykeyspace");
                        put("table", "mytable");
                    }
                }).load();

        // Print data
        dataset.show();
        spark.stop();
    }
}
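
As an aside (this is an illustration, not part of the original answer): once loaded, the Dataset can be transformed with the usual Dataset operations or queried with Spark SQL. A minimal sketch, assuming mytable has "id" and "username" columns; these statements could be inserted into the main method above, just before spark.stop():

// Illustrative only; assumes "id" and "username" columns exist.
Dataset<Row> filtered = dataset
        .select("id", "username")                      // project only the needed columns
        .filter(dataset.col("username").isNotNull());  // drop rows without a username
filtered.show();

// The Dataset can also be queried with SQL after registering a temporary view.
// Nothing is read from Cassandra until an action such as show() runs.
dataset.createOrReplaceTempView("mytable_view");
spark.sql("SELECT id, username FROM mytable_view WHERE username IS NOT NULL").show();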

If you still want to use the RDD API, then use the sample program below.

import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

import com.datastax.spark.connector.japi.CassandraJavaUtil;

public class SparkCassandraRDDApplication {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("SparkCassandraRDDApplication")
                .setMaster("local[2]")
                .set("spark.cassandra.connection.host", "127.0.0.1")
                .set("spark.cassandra.connection.port", "9042");

        JavaSparkContext sc = new JavaSparkContext(conf);

        // Read: map each Cassandra row to a UserData bean
        JavaRDD<UserData> resultsRDD = javaFunctions(sc).cassandraTable("mykeyspace", "mytable",
                CassandraJavaUtil.mapRowTo(UserData.class));

        // Print
        resultsRDD.foreach(data -> {
            System.out.println(data.id);
            System.out.println(data.username);
        });

        sc.stop();
    }
}

The JavaBean (UserData) used in the above program is shown below.

import java.io.Serializable;

public class UserData implements Serializable {
    String id;
    String username;

    public String getId() {
        return id;
    }
    public void setId(String id) {
        this.id = id;
    }
    public String getUsername() {
        return username;
    }
    public void setUsername(String username) {
        this.username = username;
    }
}
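
For completeness, here is a minimal sketch of how the assumed "mykeyspace.mytable" schema could be created with the cassandra-driver-core dependency (3.1.0) that is already in the question's pom.xml. The keyspace name, table name, and sample row are illustrative assumptions, not part of the original answer.

import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Session;

public class CreateSampleTable {
    public static void main(String[] args) {
        // Connect to the same local node the Spark examples above use.
        try (Cluster cluster = Cluster.builder().addContactPoint("127.0.0.1").build();
             Session session = cluster.connect()) {
            // Assumed schema matching the UserData bean: id and username columns.
            session.execute("CREATE KEYSPACE IF NOT EXISTS mykeyspace WITH replication = "
                    + "{'class': 'SimpleStrategy', 'replication_factor': 1}");
            session.execute("CREATE TABLE IF NOT EXISTS mykeyspace.mytable "
                    + "(id text PRIMARY KEY, username text)");
            session.execute("INSERT INTO mykeyspace.mytable (id, username) VALUES ('1', 'user1')");
        }
    }
}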
