java.lang.Exception: java.lang.NoClassDefFoundError: org/apache/lucene/util/OpenBitSet


Problem Description

In NetBeans with Maven, I have added a third-party dependency on org.apache.lucene:lucene-core 4.2.0, because newer core versions no longer contain the OpenBitSet class. Here is the pom:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.apache.hadoop</groupId>
<artifactId>DuccProject</artifactId>
<version>2.7.3</version>
<packaging>jar</packaging>
<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-hdfs</artifactId>
        <version>2.0.0-cdh4.0.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-auth</artifactId>
        <version>2.0.0-cdh4.0.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.0.0-cdh4.0.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>2.0.0-mr1-cdh4.0.1</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.10</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.lucene</groupId>
        <artifactId>lucene-core</artifactId>
        <version>4.2.0</version>
    </dependency>
    <dependency>
        <groupId>it.unimi.dsi</groupId>
        <artifactId>fastutil</artifactId>
        <version>7.0.12</version>
    </dependency>
</dependencies>
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>2.1</version>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
            </configuration>
        </plugin>
    </plugins>
</build>
</project>

I can use all of the classes contained in the package via NetBeans. I have built the jar with "mvn clean install" inside the directory that contains the pom.xml file. But when I run the jar from the CLI:

bin/hadoop jar ~/NetBeansProjects/DuccProject/DuccProject/target/DuccProject-2.7.3.jar org.apache.hadoop.duccproject.Ducc /hdfs/path/to/input /hdfs/path/to/output

where org.apache.hadoop.duccproject.Ducc is the fully qualified name of my main class, I get the following error:

java.lang.Exception: java.lang.NoClassDefFoundError: org/apache/lucene/util/OpenBitSet
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: java.lang.NoClassDefFoundError: org/apache/lucene/util/OpenBitSet
at org.apache.hadoop.columns.ColumnCombinationBitset.<init>(ColumnCombinationBitset.java:33)
at org.apache.hadoop.duccproject.ParserReducer.reduce(ParserReducer.java:53)
at org.apache.hadoop.duccproject.ParserReducer.reduce(ParserReducer.java:24)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:171)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:627)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.lucene.util.OpenBitSet
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 12 more
16/09/04 17:41:53 INFO mapreduce.Job:  map 100% reduce 0%


Recommended Answer

The error indicates that the application was unable to find the required class at runtime: a jar built with a plain "mvn clean install" contains only your own classes, not its Maven dependencies, so lucene-core is available at compile time but missing from the job's runtime classpath. You therefore also need to provide the path of the lucene-core 4.2.0 jar when launching the job, using the hadoop command options as follows:

-libjars <comma-separated list of jars>
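For example, the full launch command would look roughly like the following. The lucene-core jar path shown here is an assumption; point it at wherever lucene-core-4.2.0.jar actually lives on your machine (for instance in your local Maven repository). Note that -libjars is a generic option and must appear after the main class name but before the program arguments:

bin/hadoop jar ~/NetBeansProjects/DuccProject/DuccProject/target/DuccProject-2.7.3.jar org.apache.hadoop.duccproject.Ducc -libjars ~/.m2/repository/org/apache/lucene/lucene-core/4.2.0/lucene-core-4.2.0.jar /hdfs/path/to/input /hdfs/path/to/output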

You can also look at the other hadoop command options.
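Keep in mind that -libjars (like the other generic options) is only honored when the driver hands its arguments to GenericOptionsParser, typically by extending Configured, implementing Tool, and launching through ToolRunner. Below is a minimal, hypothetical sketch of such a driver, not the asker's actual code; the job name and the commented-out mapper/reducer wiring are assumptions:

package org.apache.hadoop.duccproject;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Minimal driver sketch: extending Configured and implementing Tool lets
// ToolRunner strip generic options such as -libjars before run() sees the args.
public class Ducc extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        Job job = Job.getInstance(getConf(), "ducc");   // hypothetical job name
        job.setJarByClass(Ducc.class);
        // Wire up the actual mapper/reducer classes from the project here, e.g.:
        // job.setMapperClass(ParserMapper.class);
        // job.setReducerClass(ParserReducer.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(), new Ducc(), args));
    }
}

With a driver like this, ToolRunner removes -libjars and its value from the argument list, so args[0] and args[1] inside run() are still the HDFS input and output paths.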
