Hadoop 2.6.0 Browsing filesystem Java

Problem description

I have installed a basic hadoop cluster on CentOS 6.6 and want to write a few basic programs (browse the filesystem, delete/add files, etc) but I'm struggling to get even the most basic app working.

When running some basic code to list the contents of a directory to the console I get the following error:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.ipc.RPC.getProxy(Ljava/lang/Class;JLjava/net/InetSocketAddress;Lorg/apache/hadoop/security/UserGroupInformation;Lorg/apache/hadoop/conf/Configuration;Ljavax/net/SocketFactory;ILorg/apache/hadoop/io/retry/RetryPolicy;Z)Lorg/apache/hadoop/ipc/VersionedProtocol;
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:135)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:280)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:245)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:100)
    at mapreducetest.MapreduceTest.App.main(App.java:36)

My pom.xml dependencies:

<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.6.0</version>
    </dependency>

    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>1.2.1</version>
    </dependency>
</dependencies>
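
One thing stands out in the block above: hadoop-core 1.2.1 is the old monolithic artifact from the Hadoop 1.x line, and mixing it with hadoop-common 2.6.0 puts two incompatible generations of classes (including org.apache.hadoop.ipc.RPC and DFSClient) on the classpath, which matches the NoSuchMethodError shown above. A minimal sketch of a consistent dependency block, assuming the cluster really is 2.6.0 (hadoop-client is a suggestion, not taken from the question; it pulls in hadoop-common and hadoop-hdfs transitively):

<dependencies>
    <!-- Keep every Hadoop artifact on the same release line as the cluster. -->
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>2.6.0</version>
    </dependency>
</dependencies>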

The code:

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.DistributedFileSystem;

public class App
{
    public static void main(String[] args) throws IOException, URISyntaxException
    {
        Configuration conf = new Configuration();

        // Connect to the namenode and list the entries under the root directory.
        FileSystem fs = new DistributedFileSystem();
        fs.initialize(new URI("hdfs://localhost:9000/"), conf);

        for (FileStatus f : fs.listStatus(new Path("/")))
        {
            System.out.println(f.getPath().getName());
        }

        fs.close();
    }
}

The error is being thrown after calling fs.initialize(). I'm really not sure what the issue is here. Am I missing dependencies? Are they the wrong version?
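
As an aside, the more idiomatic way to obtain an HDFS handle in code like this is FileSystem.get(URI, Configuration), which selects the right FileSystem implementation from the URI scheme instead of constructing DistributedFileSystem by hand. It does not change the classpath problem, but the pattern is worth showing. A minimal sketch, assuming the same hdfs://localhost:9000/ address as the question (the class name ListRoot is invented for the example):

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ListRoot
{
    public static void main(String[] args) throws IOException, URISyntaxException
    {
        Configuration conf = new Configuration();

        // FileSystem.get() returns the implementation registered for the
        // URI scheme (DistributedFileSystem for hdfs://).
        FileSystem fs = FileSystem.get(new URI("hdfs://localhost:9000/"), conf);
        try
        {
            // List the entries directly under the root directory.
            for (FileStatus f : fs.listStatus(new Path("/")))
            {
                System.out.println(f.getPath().getName());
            }
        }
        finally
        {
            fs.close();
        }
    }
}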

Solution

I was running this by calling "java -jar app.jar .... etc" when I should have been using "hadoop jar app.jar".

Worked as intended when I ran it correctly. (Running it with hadoop jar presumably fixes the mismatch because the hadoop launcher puts the cluster's installed Hadoop jars and configuration on the classpath, so the client code runs against the same version as the cluster.)
