Building Hadoop 1.2.1 core jar using ant - Failed


Question

I need to make some modifications to the map class and reduce class from Hadoop, so I have been trying to compile the Hadoop 1.2.1 jar files using ant from the source files, but I always get the following error:

Buildfile: build.xml

clover.setup:

clover.info:
 [echo] 
 [echo]      Clover not found. Code coverage reports disabled.
 [echo]   

clover:

ivy-download:
  [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0   /ivy-2.1.0.jar
  [get] To: /home/user/Downloads/hadoop-1.2.1/ivy/ivy-2.1.0.jar
  [get] Not modified - so not downloaded

ivy-init-dirs:

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: Ivy 2.1.0 - 20090925235825 :: http://ant.apache.org/ivy/ ::
[ivy:configure] :: loading settings :: file = /home/user/Downloads/hadoop-1.2.1/ivy/ivysettings.xml

ivy-resolve-common:

ivy-retrieve-common:
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file'  instead
[ivy:cachepath] :: loading settings :: file = /home/user/Downloads/hadoop-1.2.1/ivy/ivysettings.xml

init:
[touch] Creating /tmp/null1102494190
[delete] Deleting: /tmp/null1102494190
 [exec] svn: E155007: '/home/user/Downloads/hadoop-1.2.1' is not a working copy
 [exec] svn: E155007: '/home/user/Downloads/hadoop-1.2.1' is not a working copy

record-parser:

compile-rcc-compiler:

compile-core-classes:

compile-hdfs-classes:
[javac] Compiling 2 source files to /home/user/Downloads/hadoop-1.2.1/build/classes
[javac] warning: [options] bootstrap class path not set in conjunction with -source 1.6
[javac] 1 warning

compile-mapred-classes:
Trying to override old definition of task jsp-compile

create-native-configure:
 [exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before AC_USE_SYSTEM_EXTENSIONS
 [exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded from...
 [exec] configure.ac:42: the top level
 [exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before AC_USE_SYSTEM_EXTENSIONS
 [exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded from...
 [exec] configure.ac:42: the top level
 [exec] configure.ac:42: warning: AC_COMPILE_IFELSE was called before AC_USE_SYSTEM_EXTENSIONS
 [exec] ../../lib/autoconf/specific.m4:314: AC_GNU_SOURCE is expanded from...
 [exec] configure.ac:42: the top level
 [exec] configure.ac:48: error: possibly undefined macro: AC_PROG_LIBTOOL
 [exec]       If this token and others are legitimate, please use m4_pattern_allow.
 [exec]       See the Autoconf documentation.
 [exec] autoreconf: /usr/bin/autoconf failed with exit status: 1

BUILD FAILED
/home/user/Downloads/hadoop-1.2.1/build.xml:634: exec returned: 1

Does anybody know what the problem could be, or how to create the jar files using ant?

Thanks

Answer

The error is caused by a missing library. It was fixed simply by installing libtool:

sudo apt-get install libtool

Now the project will build successfully.

Source: http://desk.stinkpot.org:8080/tricks/index.php/2007/05/fixing-error-undefined-macro-ac_prog_libtool/
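For context, the `AC_PROG_LIBTOOL` macro is provided by libtool's m4 files, so `autoreconf` reports it as "possibly undefined" when libtool is not installed. A minimal sketch of the fix on a Debian/Ubuntu system (the source path is taken from the log above; the exact ant target and package names are assumptions and may differ on other distributions):

```shell
# Install libtool; autoconf and automake are also needed by the
# create-native-configure step, so install them while we are at it
sudo apt-get update
sudo apt-get install -y libtool autoconf automake

# Sanity check: libtoolize ships the m4 macros defining AC_PROG_LIBTOOL
command -v libtoolize

# Re-run the build from the Hadoop source root (path from the log above);
# the "jar" target is assumed here to produce hadoop-core-1.2.1.jar under build/
cd /home/user/Downloads/hadoop-1.2.1
ant clean jar
```

The `svn: E155007: ... is not a working copy` lines in the log are harmless in comparison: the build script merely tries to record the svn revision and continues when the source tree is a plain tarball extract rather than a checkout.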
