Building Hadoop 1.2.1 core jar using ant - Failed


Problem description

I need to make some modifications to Hadoop's map and reduce classes, so I have been trying to compile the Hadoop 1.2.1 jar files from source using ant, but I always get the following error:

Buildfile: build.xml

clover.setup:

clover.info:
 [echo] 
 [echo]      Clover not found. Code coverage reports disabled.
 [echo]   

clover:

ivy-download:
  [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
  [get] To: /home/user/Downloads/hadoop-1.2.1/ivy/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-init-dirs:

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:configure] :: loading settings :: file = /home/user/Downloads/hadoop-1.2.1/ivy/ivysettings.xml

ivy-resolve-common:

ivy-retrieve-common:
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file'  instead
[ivy:cachepath] :: loading settings :: file = /home/user/Downloads/hadoop-1.2.1/ivy/ivysettings.xml

init:
[touch] Creating /tmp/null1102494190
[delete] Deleting: /tmp/null1102494190
 [exec] svn: E155007: '/home/user/Downloads/hadoop-1.2.1' is not a working copy
 [exec] svn: E155007: '/home/user/Downloads/hadoop-1.2.1' is not a working copy

record-parser:

compile-rcc-compiler:

compile-core-classes:

compile-hdfs-classes:
[javac] Compiling 2 source files to /home/user/Downloads/hadoop-1.2.1/build/classes
[javac] warning: [options] bootstrap class path not set in conjunction with -source 1.6
[javac] 1 warning

compile-mapred-classes:
Trying to override old definition of task jsp-compile

create-native-configure:
 [exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before AC_USE_SYSTEM_EXTENSIONS
 [exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded from...
 [exec] configure.ac:42: the top level
 [exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before AC_USE_SYSTEM_EXTENSIONS
 [exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded from...
 [exec] configure.ac:42: the top level
 [exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before AC_USE_SYSTEM_EXTENSIONS
 [exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded from...
 [exec] configure.ac:42: the top level
 [exec] configure.ac:48: error: possibly undefined macro: AC_PROG_LIBTOOL
 [exec]       If this token and others are legitimate, please use m4_pattern_allow.
 [exec]       See the Autoconf documentation.
 [exec] autoreconf: /usr/bin/autoconf failed with exit status: 1

BUILD FAILED
/home/user/Downloads/hadoop-1.2.1/build.xml:634: exec returned: 1

Does anybody know what the problem could be, or how to create the jar files using ant?

Thanks

Answer

The error was caused by a missing library.

It was fixed simply by installing libtool:

sudo apt-get install libtool

Now the project builds successfully.
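Putting the fix together, here is a minimal shell sketch (assuming a Debian/Ubuntu system; the source path is taken from the build log above). The `AC_PROG_LIBTOOL` macro is provided by libtool's m4 files, so autoreconf cannot expand it until libtool is installed. The `libtoolize` check below is an illustrative way to confirm the package is present:

```shell
#!/bin/sh
# Check whether libtool is available; AC_PROG_LIBTOOL comes from
# libtool's m4 macros, so autoreconf fails without them.
if command -v libtoolize >/dev/null 2>&1; then
  echo "libtool present"
else
  echo "libtool missing - install it, e.g.: sudo apt-get install libtool"
fi
# After installing libtool, re-run the build from the source root, e.g.:
#   cd /home/user/Downloads/hadoop-1.2.1 && ant
```

The re-run step picks up where the earlier build stopped (`create-native-configure`), so no clean is strictly needed, though `ant clean` first is a safe option.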

Source: http://desk.stinkpot.org:8080/tricks/index.php/2007/05/fixing-error-undefined-macro-ac_prog_libtool/
