Can't run Pig with a single-node Hadoop server


Problem description


I have set up a VM with Ubuntu. It runs Hadoop as a single node. Later I installed Apache Pig on it. Apache Pig runs fine in local mode, but it always fails with ERROR 2999: Unexpected internal error. Failed to create DataStorage

I must be missing something very obvious. Can someone help me get this running, please?

More details:
1. I assume that Hadoop is running fine, because I can run MapReduce jobs in Python.
2. pig -x local runs as I expect.
3. When I just type pig, it gives me the following error:

Error before Pig is launched
----------------------------
ERROR 2999: Unexpected internal error. Failed to create DataStorage

java.lang.RuntimeException: Failed to create DataStorage
    at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:75)
    at org.apache.pig.backend.hadoop.datastorage.HDataStorage.<init>(HDataStorage.java:58)
    at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:214)
    at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:134)
    at org.apache.pig.impl.PigContext.connect(PigContext.java:183)
    at org.apache.pig.PigServer.<init>(PigServer.java:226)
    at org.apache.pig.PigServer.<init>(PigServer.java:215)
    at org.apache.pig.tools.grunt.Grunt.<init>(Grunt.java:55)
    at org.apache.pig.Main.run(Main.java:452)
    at org.apache.pig.Main.main(Main.java:107)
Caused by: java.io.IOException: Call to localhost/127.0.0.1:54310 failed on local exception: java.io.EOFException
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:775)
    at org.apache.hadoop.ipc.Client.call(Client.java:743)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
    at $Proxy0.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
    at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:72)
    ... 9 more
Caused by: java.io.EOFException
    at java.io.DataInputStream.readInt(DataInputStream.java:375)
    at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:501)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:446)
================================================================================

Solution

This link helped me understand the possible cause of the failure.

Here is what fixed my problem (see the sketch after this list):
1. Recompile Pig without Hadoop.
2. Update PIG_CLASSPATH to include all the jars from $HADOOP_HOME/lib.
3. Run pig.
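
Below is a minimal shell sketch of those three steps. It assumes a source checkout of Pig in $PIG_HOME and Hadoop installed under $HADOOP_HOME, and that the root cause is the usual one for this error: the Hadoop client classes bundled inside pig.jar not matching the version of the running cluster, which typically shows up as the EOFException above. The ant target name (jar-withouthadoop) and the exact jar locations vary between releases, so treat this as an outline rather than exact commands.

# 1. Rebuild Pig without the bundled Hadoop classes
#    (jar-withouthadoop is the assumed ant target; check build.xml for your release).
cd $PIG_HOME
ant clean jar-withouthadoop

# 2. Put the cluster's own jars on PIG_CLASSPATH. The conf directory and the
#    hadoop-core jar are included here as an assumption beyond the original answer,
#    since the rebuilt Pig jar no longer carries any Hadoop classes of its own.
export HADOOP_HOME=/usr/local/hadoop        # adjust to your installation
PIG_CLASSPATH=$HADOOP_HOME/conf
for jar in $HADOOP_HOME/hadoop-core-*.jar $HADOOP_HOME/lib/*.jar; do
    PIG_CLASSPATH=$PIG_CLASSPATH:$jar
done
export PIG_CLASSPATH

# 3. Start Pig in mapreduce mode; it should now reach the NameNode at localhost:54310.
pig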

Thanks.
