My CDH 5.2 cluster gets a FileNotFoundException when running HBase MR jobs


My CDH 5.2 cluster has a problem running HBase MR jobs.

For example, I added the HBase classpath to the Hadoop classpath:

vi /etc/hadoop/conf/hadoop-env.sh

and added the line:

export HADOOP_CLASSPATH="/usr/lib/hbase/bin/hbase classpath:$HADOOP_CLASSPATH"
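
A note on the line above: as quoted, it sets HADOOP_CLASSPATH to the literal string "/usr/lib/hbase/bin/hbase classpath" rather than to the output of the hbase classpath command. The usual form uses command substitution; a minimal sketch, assuming that was the intent (the backticks may simply have been lost in formatting):

export HADOOP_CLASSPATH="$(/usr/lib/hbase/bin/hbase classpath):$HADOOP_CLASSPATH"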

And when I run:

hadoop jar /usr/lib/hbase/hbase-server-0.98.6-cdh5.2.1.jar rowcounter "mytable"

I get the following exception:

14/12/09 03:44:02 WARN security.UserGroupInformation: PriviledgedActionException as:root (auth:SIMPLE) cause:java.io.FileNotFoundException: File does not exist: hdfs://clusterName/usr/lib/hbase/lib/hbase-client-0.98.6-cdh5.2.1.jar
Exception in thread "main" java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.hbase.mapreduce.Driver.main(Driver.java:54)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.io.FileNotFoundException: File does not exist: hdfs://clusterName/usr/lib/hbase/lib/hbase-client-0.98.6-cdh5.2.1.jar
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1083)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1075)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1075)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:288)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.getFileStatus(ClientDistributedCacheManager.java:224)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestamps(ClientDistributedCacheManager.java:93)
        at org.apache.hadoop.mapreduce.filecache.ClientDistributedCacheManager.determineTimestampsAndCacheVisibilities(ClientDistributedCacheManager.java:57)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:265)
        at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:301)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
        at org.apache.hadoop.hbase.mapreduce.RowCounter.main(RowCounter.java:191)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
        at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:145)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:153)

Solution

So, the problem was an environment issue: when I added the jars below into /usr/lib/hadoop/lib, everything worked fine (a sketch of the commands follows the list).

hbase-client-0.98.6-cdh5.2.1.jar
hbase-common-0.98.6-cdh5.2.1.jar
hbase-protocol-0.98.6-cdh5.2.1.jar
hbase-server-0.98.6-cdh5.2.1.jar
hbase-prefix-tree-0.98.6-cdh5.2.1.jar
hadoop-core-2.5.0-mr1-cdh5.2.1.jar
htrace-core-2.04.jar
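
A minimal sketch of that workaround, assuming the standard CDH 5.2.1 package layout where the HBase jars live under /usr/lib/hbase/lib; the hadoop-core MR1 jar ships with the hadoop-0.20-mapreduce package, so link it from wherever that package installed it:

# run on each node: symlink the HBase jars into the Hadoop lib dir
cd /usr/lib/hadoop/lib
for j in hbase-client hbase-common hbase-protocol hbase-server hbase-prefix-tree; do
    ln -s /usr/lib/hbase/lib/${j}-0.98.6-cdh5.2.1.jar .
done
ln -s /usr/lib/hbase/lib/htrace-core-2.04.jar .
# hadoop-core-2.5.0-mr1-cdh5.2.1.jar comes from the MR1 (hadoop-0.20-mapreduce) package;
# copy or link it from its install location as well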

My machine has the following RPMs installed:

>> rpm -qa | grep cdh
zookeeper-3.4.5+cdh5.2.1+84-1.cdh5.2.1.p0.13.el6.x86_64
hadoop-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hadoop-0.20-mapreduce-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hbase-regionserver-0.98.6+cdh5.2.1+64-1.cdh5.2.1.p0.9.el6.x86_64
cloudera-cdh-5-0.x86_64
bigtop-utils-0.7.0+cdh5.2.1+0-1.cdh5.2.1.p0.13.el6.noarch
bigtop-jsvc-0.6.0+cdh5.2.1+578-1.cdh5.2.1.p0.13.el6.x86_64
parquet-1.5.0+cdh5.2.1+38-1.cdh5.2.1.p0.12.el6.noarch
hadoop-hdfs-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hadoop-mapreduce-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hadoop-0.20-mapreduce-tasktracker-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hbase-0.98.6+cdh5.2.1+64-1.cdh5.2.1.p0.9.el6.x86_64
avro-libs-1.7.6+cdh5.2.1+69-1.cdh5.2.1.p0.13.el6.noarch
parquet-format-2.1.0+cdh5.2.1+6-1.cdh5.2.1.p0.14.el6.noarch
hadoop-yarn-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64
hadoop-hdfs-datanode-2.5.0+cdh5.2.1+578-1.cdh5.2.1.p0.14.el6.x86_64

I still wonder which rpm is missing.
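
A quick sanity check after placing the jars (hypothetical, using the same paths and job as above) is to confirm they are present in /usr/lib/hadoop/lib and then re-run the job:

ls -l /usr/lib/hadoop/lib/hbase-*.jar /usr/lib/hadoop/lib/htrace-core-*.jar
hadoop jar /usr/lib/hbase/hbase-server-0.98.6-cdh5.2.1.jar rowcounter "mytable"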
