Handling Error: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected


Problem Description


I am using CDH4 and have written a MapReduce application using the new mapreduce API. I have compiled it against hadoop-core-1.0.3.jar and when I run it on my Hadoop cluster I get the error:

Error: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected

I referred to this StackOverflow question, which seems to be talking about the same problem. The answer suggests that we compile our code against the Hadoop-core-2.X.jar file, but I am unable to find anything like that.

So how do I compile it so that it runs flawlessly in CDH4?

Solution

The answer in the link posted in the question above asks to compile against the Hadoop 2.0 libraries. Incidentally, post Hadoop 1.0, instead of using one single Hadoop Core jar for compilation, two (or maybe more) different jars are to be used.

I used:

hadoop-common-2.0.2-alpha.jar
hadoop-mapreduce-client-core-2.0.2-alpha.jar

for compiling my code, and after that it ran fine without giving the aforementioned error.
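For a Maven build, the two jars above correspond to two separate dependencies. The following is a minimal sketch, assuming a Maven project and the 2.0.2-alpha version mentioned above; adjust the version to match the Hadoop release shipped with your CDH4 cluster:

```xml
<!-- Replaces the single hadoop-core dependency used with Hadoop 1.x -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.0.2-alpha</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-core</artifactId>
    <version>2.0.2-alpha</version>
    <scope>provided</scope>
</dependency>
```

The `provided` scope keeps the Hadoop classes out of the job jar, since the cluster supplies them at runtime; the important point is that both compile-time and runtime Hadoop versions agree, because TaskAttemptContext changed from a class to an interface between Hadoop 1.x and 2.x, which is exactly the binary incompatibility behind the error above.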

