Eclipse Hadoop plugin issue (Call to localhost/127.0.0.1:50070): can anybody give me a solution for this?


Problem Description

Issue: Eclipse Hadoop plugin issue (Call to localhost/127.0.0.1:50070 failed on local exception: java.io.EOFException). Can anybody give me a solution for this?

I am following the Cloudera training tutorial, which uses Eclipse 3.6 (Helios) and Hadoop 0.20.2-cdh3u2.

I downloaded hadoop-eclipse-plugin-0.20.3-SNAPSHOT.jar and copied it into the /home/training/eclipse/plugins/ folder.

I started Eclipse and went to File (in the menu bar) --> New --> Other.

From Other, I selected Map/Reduce Project. I selected "Specify Hadoop library location" and gave the location as "/usr/lib/hadoop". This location contains the following files:

bin                                hadoop-examples-0.20.2-cdh3u2.jar
build.xml                          hadoop-examples.jar
CHANGES.txt                        hadoop-test-0.20.2-cdh3u2.jar
conf                               hadoop-test.jar
contrib                            hadoop-tools-0.20.2-cdh3u2.jar
example-confs                      hadoop-tools.jar
hadoop-0.20.2-cdh3u2-ant.jar       ivy
hadoop-0.20.2-cdh3u2-core.jar      ivy.xml
hadoop-0.20.2-cdh3u2-examples.jar  lib
hadoop-0.20.2-cdh3u2-test.jar      LICENSE.txt
hadoop-0.20.2-cdh3u2-tools.jar     logs
hadoop-ant-0.20.2-cdh3u2.jar       NOTICE.txt
hadoop-ant.jar                     pids
hadoop-core-0.20.2-cdh3u2.jar      README.txt
hadoop-core.jar                    webapps

I named the MapReduce project "myhadoop" and clicked the Finish button. I got the Map/Reduce entry under DFS Locations, but not its hierarchy.

I then went and checked my dfs and mapred ports.

My core-site.xml is:

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:8020</value>
  </property>
</configuration>
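As a quick sanity check, the host and port the HDFS client will connect to can be parsed out of the fs.default.name value. A minimal sketch, with the value inlined for illustration (in practice it comes from core-site.xml):

```shell
# Extract host and port from an fs.default.name value.
# The value is inlined here; in practice read it from core-site.xml.
value='hdfs://localhost:8020'
echo "$value" | sed -n 's|.*hdfs://\([^:]*\):\([0-9]*\).*|host=\1 port=\2|p'
```

The host and port printed here are what the Eclipse plugin's DFS Master settings must match.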

My mapred-site.xml is:

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:8021</value>
  </property>
</configuration>

In the Define Hadoop Location dialog in Eclipse, I entered the following:

Map/Reduce Master
Host: localhost
Port: 50021

DFS Master
Host: localhost
Port: 50020

At the same time, I selected to use the M/R Master host.

I ran Cloudera's example WordCount program, but it gives me the issue below. Please give me a solution; I have been trying for 2 days.

Exception in thread "main" java.io.IOException: Call to localhost/127.0.0.1:50070 failed on local exception: java.io.EOFException
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:1142)
    at org.apache.hadoop.ipc.Client.call(Client.java:1110)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
    at $Proxy0.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:398)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:384)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:111)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:213)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:180)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1514)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1548)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1530)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:111)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:212)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:183)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(FileInputFormat.java:368)
    at WordCount.main(WordCount.java:65)
Caused by: java.io.EOFException
    at java.io.DataInputStream.readInt(DataInputStream.java:375)
    at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:815)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:724)

Recommended Answer

Try changing the machine name in the configuration from localhost to the hostname.
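As an illustration, assuming the machine's hostname is training-vm (a placeholder; substitute the output of the `hostname` command), core-site.xml would become:

```xml
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- "training-vm" is a placeholder; use your machine's actual hostname -->
    <value>hdfs://training-vm:8020</value>
  </property>
</configuration>
```

mapred.job.tracker in mapred-site.xml would change to training-vm:8021 in the same way, and the same hostname should then be entered in Eclipse's Define Hadoop Location dialog.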

