Eclipse Hadoop plugin issue (Call to localhost/127.0.0.1:50070). Can anybody give me the solution for this?


Problem description


Issue: Eclipse Hadoop plugin issue (Call to localhost/127.0.0.1:50070 failed on local exception: java.io.EOFException). Can anybody give me the solution for this?

I am following the Cloudera training tutorial, which uses Eclipse 3.6 (Helios) and Hadoop 0.20.2-cdh3u2.

I downloaded hadoop-eclipse-plugin-0.20.3-SNAPSHOT.jar and copied it into the /home/training/eclipse/plugins/ folder.

I started Eclipse and went to File (in the menu bar) --> New --> Other.

From Other I selected MapReduce Project, chose "Specify Hadoop library location", and gave the location as "/usr/lib/hadoop". That directory contains the following files:

bin                                hadoop-examples-0.20.2-cdh3u2.jar
build.xml                          hadoop-examples.jar
CHANGES.txt                        hadoop-test-0.20.2-cdh3u2.jar
conf                               hadoop-test.jar
contrib                            hadoop-tools-0.20.2-cdh3u2.jar
example-confs                      hadoop-tools.jar
hadoop-0.20.2-cdh3u2-ant.jar       ivy
hadoop-0.20.2-cdh3u2-core.jar      ivy.xml
hadoop-0.20.2-cdh3u2-examples.jar  lib
hadoop-0.20.2-cdh3u2-test.jar      LICENSE.txt
hadoop-0.20.2-cdh3u2-tools.jar     logs
hadoop-ant-0.20.2-cdh3u2.jar       NOTICE.txt
hadoop-ant.jar                     pids
hadoop-core-0.20.2-cdh3u2.jar      README.txt
hadoop-core.jar                    webapps

I named the MapReduce project "myhadoop" and clicked the Finish button. The project appears under DFS Locations, but its hierarchy does not expand.

I then checked my dfs and mapred ports.

My core-site.xml is

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:8020</value>
  </property>
</configuration>

My mapred-site.xml is

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:8021</value>
  </property>
</configuration>

In the Map/Reduce "Define Hadoop location" dialog in Eclipse I entered the following:

Map/Reduce Master
Host: localhost
Port: 50021

DFS Master
Host: localhost
Port: 50020

In the same dialog I selected the option to use the M/R Master host.
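Note that these dialog ports do not match the XML configs above, which use 8020 for HDFS and 8021 for the JobTracker. A sketch of plugin settings that would line up with those configs (the hostname placeholder is an assumption; the ports are taken directly from the configs above):

```
Map/Reduce Master
Host: <your-machine-hostname>   (placeholder; use the real hostname)
Port: 8021                      (matches mapred.job.tracker)

DFS Master
Host: <your-machine-hostname>
Port: 8020                      (matches fs.default.name)
```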

I ran Cloudera's example WordCount program, but it gives me the issue below. Please give me a solution; I have been trying for 2 days.

Exception in thread "main" java.io.IOException: Call to localhost/127.0.0.1:50070 failed on local exception: java.io.EOFException
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:1142)
    at org.apache.hadoop.ipc.Client.call(Client.java:1110)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
    at $Proxy0.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:398)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:384)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:111)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:213)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:180)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1514)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1548)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1530)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:111)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:212)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:183)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(FileInputFormat.java:368)
    at WordCount.main(WordCount.java:65)
Caused by: java.io.EOFException
    at java.io.DataInputStream.readInt(DataInputStream.java:375)
    at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:815)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:724)
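The trace shows the client calling port 50070, while fs.default.name above points at port 8020; in this Hadoop version 50070 is by default the NameNode's HTTP (web UI) port, so an RPC call there would plausibly fail with an EOFException because the HTTP server does not speak the Hadoop IPC protocol. A minimal JDK-only sketch (no Hadoop dependency; the URI value is copied from core-site.xml above) of extracting the endpoint the DFS client should actually contact:

```java
import java.net.URI;

public class RpcEndpointCheck {
    public static void main(String[] args) {
        // fs.default.name as configured in core-site.xml
        URI fsUri = URI.create("hdfs://localhost:8020");
        // The DFS client derives its RPC target from this URI,
        // so the failing call should go to this host:port,
        // not to 50070 as seen in the stack trace.
        System.out.println("DFS client will call " + fsUri.getHost() + ":" + fsUri.getPort());
    }
}
```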

Solution

Try changing localhost in your configuration to the machine's hostname.
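Concretely, that would mean something like the following in core-site.xml and mapred-site.xml (a sketch; "myhost" is a placeholder for the machine's actual hostname), with the Eclipse location dialog updated to use the same host and ports:

```xml
<!-- core-site.xml: "myhost" is a placeholder for your machine's hostname -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://myhost:8020</value>
  </property>
</configuration>

<!-- mapred-site.xml -->
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>myhost:8021</value>
  </property>
</configuration>
```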
