java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging


Problem Description

My Spark Streaming program fails with the following error: Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging. My Spark version is 2.1, the same as the version running on the cluster.

The information I found online suggested that org.apache.spark.Logging in old versions of Spark became org.apache.spark.internal.Logging in newer versions, and that this kind of version mismatch is what makes the class unfindable. But the dependencies declared in my pom are already the new version. Why can't the class be found?

    <properties>
            <spark.version>2.1.0</spark.version>
            <scala.version>2.11</scala.version>
    </properties>

    <dependencies>
            <dependency>
                    <groupId>org.apache.spark</groupId>
                    <artifactId>spark-core_${scala.version}</artifactId>
                    <version>${spark.version}</version>
            </dependency>
            <dependency>
                    <groupId>org.apache.spark</groupId>
                    <artifactId>spark-streaming_${scala.version}</artifactId>
                    <version>${spark.version}</version>
            </dependency>
            <dependency>
                    <groupId>org.apache.spark</groupId>
                    <artifactId>spark-sql_${scala.version}</artifactId>
                    <version>${spark.version}</version>
            </dependency>
            <dependency>
                    <groupId>org.apache.spark</groupId>
                    <artifactId>spark-hive_${scala.version}</artifactId>
                    <version>${spark.version}</version>
            </dependency>
            <dependency>
                    <groupId>org.apache.spark</groupId>
                    <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
                    <version>2.1.0</version>
            </dependency>
            <dependency>
                    <groupId>org.apache.hadoop</groupId>
                    <artifactId>hadoop-client</artifactId>
                    <version>2.6.0</version>
            </dependency>
            <dependency>
                    <groupId>org.scala-tools</groupId>
                    <artifactId>maven-scala-plugin</artifactId>
                    <version>2.15.2</version>
            </dependency>
            <dependency>
                    <groupId>log4j</groupId>
                    <artifactId>log4j</artifactId>
                    <version>1.2.17</version>
            </dependency>
    </dependencies>
    <build>
            <plugins>
                    <plugin>
                            <groupId>org.scala-tools</groupId>
                            <artifactId>maven-scala-plugin</artifactId>
                            <version>2.15.2</version>
                            <executions>
                                    <execution>
                                            <goals>
                                                    <goal>compile</goal>
                                                    <goal>testCompile</goal>
                                            </goals>
                                    </execution>
                            </executions>
                    </plugin>

                    <plugin>
                            <artifactId>maven-compiler-plugin</artifactId>
                            <version>3.6.0</version>
                            <configuration>
                                    <source>1.8</source>
                                    <target>1.8</target>
                            </configuration>
                    </plugin>
                    <plugin>
                            <groupId>org.apache.maven.plugins</groupId>
                            <artifactId>maven-surefire-plugin</artifactId>
                            <version>2.19</version>
                            <configuration>
                                    <skip>true</skip>
                            </configuration>
                    </plugin>
            </plugins>
    </build>
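One way to check what Maven actually resolves for a pom like the one above (the runtime classpath on the cluster may still differ) is to print the dependency tree. A sketch using the standard `dependency:tree` goal of the Maven Dependency Plugin, filtered to Spark artifacts:

```shell
# Run from the project root. The includes filter restricts the
# tree to org.apache.spark artifacts so version conflicts stand out.
mvn dependency:tree -Dincludes=org.apache.spark
```

If the tree shows a spark-core version other than 2.1.0, some transitive dependency is overriding the declared one.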

Recommended Answer

I have encountered java.lang.NoClassDefFoundError several times. I have Spark 2.3.1 installed, so I think the same approach should work in your case too.

In my case, java.lang.NoClassDefFoundError: org/apache/spark/internal/Logging came from spark-core_2.11-2.3.1.jar; in your case, it should come from spark-core_2.11-2.1.0.jar, based on the Spark and Scala versions mentioned in the question.

When you look at the source code of that class, it uses org.slf4j._ classes. So my recommendation would be to add that dependency to your pom and try again. For a NoClassDefFoundError, it is always best to find the jar that should be providing the missing class and then backtrack from there.

Below is how you can find out which jar is (or is not) providing the class behind the NoClassDefFoundError, assuming all the dependency jars are in the ~/spark/jars directory.

for i in ~/spark/jars/*.jar; do
  jar tvf "$i" | grep org/apache/spark/internal/Logging
  echo "************** $i **************"
done

Please let me know if this solves your issue.

