Hadoop/Eclipse - Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FileSystem


Question


I'm trying to run the PutMerge program from Hadoop in Action by Chuck Lam from Manning Publishing. It should be pretty simple, but I've had a bunch of problems trying to run it, and I've gotten to this error that I just can't figure out. Meanwhile, I'm running a basic wordcount program with no problem. I've spent about 3 days on this now. I've done all the research I possibly can on this, and I'm just lost.

Do you have any ideas?

Program:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;


public class PutMerge {

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();

        FileSystem hdfs = FileSystem.get(conf);
        FileSystem local = FileSystem.getLocal(conf);

        Path inputDir = new Path(args[0]);
        Path hdfsFile = new Path(args[1]);


        try{
            FileStatus[] inputFiles = local.listStatus(inputDir);
            FSDataOutputStream out = hdfs.create(hdfsFile);

            for (int i = 0; i < inputFiles.length; i++) { // note: <= here would run past the end of the array
                System.out.println(inputFiles[i].getPath().getName());
                FSDataInputStream in = local.open(inputFiles[i].getPath());

                byte buffer[] = new byte[256];
                int bytesRead = 0;

                while( (bytesRead = in.read(buffer)) > 0) {
                    out.write(buffer, 0, bytesRead);
                }

                in.close();

            }

            out.close();

        } catch(IOException e){

            e.printStackTrace();

        }

    }

}


Output Error from Eclipse:

    2015-04-09 19:45:48,321 WARN  util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FileSystem
    at java.lang.ClassLoader.findBootstrapClass(Native Method)
    at java.lang.ClassLoader.findBootstrapClassOrNull(ClassLoader.java:1012)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:413)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:344)
    at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
    at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
    at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2563)
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2574)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2591)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
    at PutMerge.main(PutMerge.java:16)

About Eclipse:

Eclipse IDE for Java Developers
Version: Luna Service Release 2 (4.4.2)
Build id: 20150219-0600

About Hadoop:

Hadoop 2.6.0
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r e3496499ecb8d220fba99dc5ed4c99c8f9e33bb1
Compiled by jenkins on 2014-11-13T21:10Z
Compiled with protoc 2.5.0
From source with checksum 18e43357c8f927c0695f1e9522859d6a
This command was run using /usr/local/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar

About Java:

java version "1.8.0_31"
Java(TM) SE Runtime Environment (build 1.8.0_31-b13)
Java HotSpot(TM) 64-Bit Server VM (build 25.31-b07, mixed mode)  

About my machine:

Mac OSX 10.9.5


Java Build Path - External JARs in Library:

Accepted answer


My experience with the Eclipse IDE:


My base path for the Ubuntu installation is usr/hadoop/hadoop-2.7.1 (let's call it CONF). I added two jar files, one from CONF/share/hadoop/common/lib and one from CONF/share/hadoop/common. And this is the Java code (from the book Hadoop in Action):
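As a sketch of the same idea outside Eclipse (treat the installation path as an assumption; exact jar names vary by Hadoop release), a wildcard classpath can cover both of those directories:

```shell
# Sketch only: assumes Hadoop is unpacked at /usr/hadoop/hadoop-2.7.1
# (the "CONF" path in the text above); adjust to your installation.
CONF=/usr/hadoop/hadoop-2.7.1
# Wildcard classpath entries covering both directories added in Eclipse:
CP="$CONF/share/hadoop/common/*:$CONF/share/hadoop/common/lib/*"
echo "$CP"
```

With that variable set, `javac -cp "$CP" PutMerge.java` would compile against the same jars the Eclipse build path references.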

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;


public class PutMerge {

    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();

        conf.set("fs.file.impl",org.apache.hadoop.fs.LocalFileSystem.class.getName());

        FileSystem hdfs = FileSystem.get(conf);
        FileSystem local = FileSystem.getLocal(conf);
        Path inputDir = new Path(args[0]);
        Path hdfsFile = new Path(args[1]);
        try {
            FileStatus[] inputFiles = local.listStatus(inputDir);
            FSDataOutputStream out = hdfs.create(hdfsFile);
            for (int i=0; i<inputFiles.length; i++) {
                System.out.println(inputFiles[i].getPath().getName());
                FSDataInputStream in = local.open(inputFiles[i].getPath());
                byte buffer[] = new byte[256];
                int bytesRead = 0;
                while( (bytesRead = in.read(buffer)) > 0) {
                    out.write(buffer, 0, bytesRead);
                }
                in.close();
            }
            out.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}


The solution for me was to export a .jar file from this code. This is what I did: right-click on the PutMerge project, then Export (from the pop-up menu):


and saved the jar file in a folder named PutMerge in the home/hduser directory.


In another folder named input (path /home/hduser/input) there are three .txt files as input for the PutMerge procedure:


And now we are ready to launch the command from a terminal session:

hadoop jar /home/hduser/PutMerge/PutMerge.jar PutMerge /home/hduser/input output4/all


and the command /usr/hadoop/hadoop-2.7.1$ hdfs dfs -cat /output4/all


will contain all the text of the three single files.
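The copy loop itself can also be sanity-checked without a Hadoop cluster. The sketch below mirrors PutMerge's merge behavior with plain java.nio on local files; the class name and temp-file names are made up for illustration:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;

// Hypothetical stand-in for PutMerge: concatenates every file in a local
// directory into one output file, mirroring the read/write loop above.
public class LocalMerge {

    public static void merge(Path inputDir, Path outFile) throws IOException {
        try (OutputStream out = Files.newOutputStream(outFile);
             DirectoryStream<Path> files = Files.newDirectoryStream(inputDir)) {
            for (Path p : files) {
                System.out.println(p.getFileName()); // same progress print as PutMerge
                Files.copy(p, out);                  // equivalent of the 256-byte buffer loop
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Self-contained demo: two temp files merged into a temp output file.
        Path dir = Files.createTempDirectory("putmerge-demo");
        Files.write(dir.resolve("a.txt"), "hello ".getBytes());
        Files.write(dir.resolve("b.txt"), "world".getBytes());
        Path out = Files.createTempFile("putmerge", ".out");
        merge(dir, out);
        System.out.println(Files.size(out)); // 11 bytes: both inputs concatenated
    }
}
```

Note that `Files.newDirectoryStream` does not guarantee an ordering, which matches PutMerge: `listStatus` likewise makes no promise about the order in which the input files are concatenated.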
