Hadoop 2.6.0 build on Windows 8.1 fails - Ant BuildException


Problem description


Hello, I'm trying to build Hadoop 2.6.0 on Windows 8.1. Unfortunately, no luck so far.

I have installed:

  • jdk1.7.0_71 (added Variable JAVA_HOME with value C:\Program Files\Java\jdk1.7.0_71 to the User Variables)
  • cygwin64 (added its installation-directory as value D:\cygwin64\bin to the PATH Variable under System Variables)
  • Maven 3.2.5 (added its installation-directory as value D:\maven\bin to the PATH Variable under System Variables)
  • Protocol Buffer 2.5 (added its installation-directory as value D:\protobuf to the PATH Variable under System Variables)
  • Visual Studio 2010

In the Visual Studio 2010 Command Prompt (started as Administrator) I changed the drive to D: and used the folder containing the Hadoop source as the starting point ("D:\hdp").
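
For reference, a quick sanity check along these lines can be run from that same prompt to confirm that each tool resolves from the PATH and prints a version (just a sketch; the exact version strings will differ):

REM Each of these should print a version banner if the PATH is set up correctly
java -version
mvn -version
protoc --version
cmake --version
msbuild /version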

I then set JAVA_HOME again using the short (8.3) notation and set the Platform variable:

set JAVA_HOME=C:\PROGRA~1\Java\jdk1.7.0_71
set Platform=x64
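
(To double-check the 8.3 short name instead of assuming PROGRA~1, dir /x can print it:)

REM The /x switch shows 8.3 short names; "PROGRA~1" should appear next to "Program Files"
dir /x C:\
echo %JAVA_HOME%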

Afterwards I try to build Hadoop using the following Maven command:

mvn -e -X package -Pdist -DskipTests -Dtar

The build then fails at "Apache Hadoop Common", stating the following:

[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.6.0:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: protoc failure -> [Help 1]

I have slightly changed the file "ProtocMojo.java" under D:\hdp\hadoop-maven-plugins\src\main\java\org\apache\hadoop\maven\plugin\protoc by inserting the following after line 56 in the file:

protocCommand = "D:\\protobuf\\protoc.exe";

This helps the build get further, to "Apache Hadoop HDFS", where it fails again, stating the following:

  [INFO] Executing tasks
Build sequence for target(s) `main' is [main]
Complete build sequence is [main, ]
main:
    [mkdir] Skipping D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target\native because it already exists.
     [exec] Current OS is Windows 8.1
     [exec] Executing 'cmake' with arguments:
     [exec] 'D:\hdp\hadoop-hdfs-project\hadoop-hdfs/src/'
     [exec] '-DGENERATED_JAVAH=D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target/native/javah'
     [exec] '-DJVM_ARCH_DATA_MODEL=64'
     [exec] '-DREQUIRE_LIBWEBHDFS=false'
     [exec] '-DREQUIRE_FUSE=false'
     [exec] '-G'
     [exec] 'Visual Studio 10 Win64'
     [exec]
     [exec] The ' characters around the executable and arguments are
     [exec] not part of the command.
Execute:Java13CommandLauncher: Executing 'cmake' with arguments:
'D:\hdp\hadoop-hdfs-project\hadoop-hdfs/src/'
'-DGENERATED_JAVAH=D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target/native/javah'
'-DJVM_ARCH_DATA_MODEL=64'
'-DREQUIRE_LIBWEBHDFS=false'
'-DREQUIRE_FUSE=false'
'-G'
'Visual Studio 10 Win64'

The ' characters around the executable and arguments are
not part of the command.
     [exec] CMake Error: Could not create named generator Visual Studio 10 Win64
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [  0.906 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  0.719 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  1.469 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.265 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  1.766 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  5.516 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  1.431 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  2.119 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  1.969 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:11 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  4.087 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 11.742 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.110 s]
[INFO] Apache Hadoop HDFS ................................. FAILURE [ 11.782 s]
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SKIPPED
[INFO] hadoop-yarn ........................................ SKIPPED
[INFO] hadoop-yarn-api .................................... SKIPPED
[INFO] hadoop-yarn-common ................................. SKIPPED
[INFO] hadoop-yarn-server ................................. SKIPPED
[INFO] hadoop-yarn-server-common .......................... SKIPPED
[INFO] hadoop-yarn-server-nodemanager ..................... SKIPPED
[INFO] hadoop-yarn-server-web-proxy ....................... SKIPPED
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SKIPPED
[INFO] hadoop-yarn-server-resourcemanager ................. SKIPPED
[INFO] hadoop-yarn-server-tests ........................... SKIPPED
[INFO] hadoop-yarn-client ................................. SKIPPED
[INFO] hadoop-yarn-applications ........................... SKIPPED
[INFO] hadoop-yarn-applications-distributedshell .......... SKIPPED
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SKIPPED
[INFO] hadoop-yarn-site ................................... SKIPPED
[INFO] hadoop-yarn-registry ............................... SKIPPED
[INFO] hadoop-yarn-project ................................ SKIPPED
[INFO] hadoop-mapreduce-client ............................ SKIPPED
[INFO] hadoop-mapreduce-client-core ....................... SKIPPED
[INFO] hadoop-mapreduce-client-common ..................... SKIPPED
[INFO] hadoop-mapreduce-client-shuffle .................... SKIPPED
[INFO] hadoop-mapreduce-client-app ........................ SKIPPED
[INFO] hadoop-mapreduce-client-hs ......................... SKIPPED
[INFO] hadoop-mapreduce-client-jobclient .................. SKIPPED
[INFO] hadoop-mapreduce-client-hs-plugins ................. SKIPPED
[INFO] Apache Hadoop MapReduce Examples ................... SKIPPED
[INFO] hadoop-mapreduce ................................... SKIPPED
[INFO] Apache Hadoop MapReduce Streaming .................. SKIPPED
[INFO] Apache Hadoop Distributed Copy ..................... SKIPPED
[INFO] Apache Hadoop Archives ............................. SKIPPED
[INFO] Apache Hadoop Rumen ................................ SKIPPED
[INFO] Apache Hadoop Gridmix .............................. SKIPPED
[INFO] Apache Hadoop Data Join ............................ SKIPPED
[INFO] Apache Hadoop Ant Tasks ............................ SKIPPED
[INFO] Apache Hadoop Extras ............................... SKIPPED
[INFO] Apache Hadoop Pipes ................................ SKIPPED
[INFO] Apache Hadoop OpenStack support .................... SKIPPED
[INFO] Apache Hadoop Amazon Web Services support .......... SKIPPED
[INFO] Apache Hadoop Client ............................... SKIPPED
[INFO] Apache Hadoop Mini-Cluster ......................... SKIPPED
[INFO] Apache Hadoop Scheduler Load Simulator ............. SKIPPED
[INFO] Apache Hadoop Tools Dist ........................... SKIPPED
[INFO] Apache Hadoop Tools ................................ SKIPPED
[INFO] Apache Hadoop Distribution ......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:57 min
[INFO] Finished at: 2015-03-09T17:08:10+01:00
[INFO] Final Memory: 88M/1092M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1
[ERROR] around Ant part ...<exec dir="D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake" failonerror="true">... @ 5:106 in D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-hdfs: An Ant BuildException has occured: exec returned: 1
around Ant part ...<exec dir="D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake" failonerror="true">... @ 5:106 in D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:216)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
        at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
        at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:120)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:355)
        at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:155)
        at org.apache.maven.cli.MavenCli.execute(MavenCli.java:584)
        at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:216)
        at org.apache.maven.cli.MavenCli.main(MavenCli.java:160)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
        at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
        at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoExecutionException: An Ant BuildException has occured: exec returned: 1
around Ant part ...<exec dir="D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target/native" executable="cmake" failonerror="true">... @ 5:106 in D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml
        at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:355)
        at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:132)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
        ... 19 more
Caused by: D:\hdp\hadoop-hdfs-project\hadoop-hdfs\target\antrun\build-main.xml:5: exec returned: 1
        at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:646)
        at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:672)
        at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:498)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
        at sun.reflect.GeneratedMethodAccessor21.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:348)
        at org.apache.tools.ant.Target.execute(Target.java:390)
        at org.apache.tools.ant.Target.performTasks(Target.java:411)
        at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1399)
        at org.apache.tools.ant.Project.executeTarget(Project.java:1368)
        at org.apache.maven.plugin.antrun.AntRunMojo.execute(AntRunMojo.java:327)
        ... 21 more
[ERROR]
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs

This is where I'm stuck and do not know how to solve the problem. Help is appreciated, as I am new to Hadoop and to building in general.

Thanks! elro

Solution

I had the same error. I fixed it by making sure the cmake being invoked wasn't the one that comes with Cygwin. For some reason, Cygwin's cmake doesn't recognize the Visual Studio generators.
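
For anyone hitting the same "Could not create named generator Visual Studio 10 Win64" error, a quick way to check which cmake the build picks up is sketched below (D:\cmake\bin is only an assumed install location for a standalone Windows CMake):

REM List every cmake.exe on the PATH, in resolution order; the Cygwin one must not come first
where cmake
REM If D:\cygwin64\bin\cmake.exe wins, put a standalone CMake ahead of it for this session
set PATH=D:\cmake\bin;%PATH%
cmake --version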
