spark 0.9.1 on hadoop 2.2.0 maven dependency
Problem description
I set up the Apache Spark Maven dependency in pom.xml as follows:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>0.9.1</version>
</dependency>
However, I found that this dependency pulls in hadoop-client-1.0.4.jar and hadoop-core-1.0.4.jar, and when I run my program I get the error "org.apache.hadoop.ipc.RemoteException: Server IPC version 9 cannot communicate with client version 4", which indicates that I need to switch the Hadoop version from 1.0.4 to 2.2.0.
Update
Is the following solution a correct way to solve this problem?
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>0.9.1</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-core</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.2.0</version>
</dependency>
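One way to check whether the exclusions had the intended effect is Maven's dependency tree, filtered to the Hadoop group (the `-Dincludes` filter shown is just one convenient pattern):

```shell
# List only the org.apache.hadoop artifacts Maven resolves for this project.
# With the exclusions above, hadoop-client should appear at version 2.2.0
# and hadoop-core-1.0.4 should no longer be listed.
mvn dependency:tree -Dincludes=org.apache.hadoop
```

This must be run inside the project directory containing the pom.xml, so the output depends on your local build.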
Many thanks for your help.
Recommended answer
Spark 1.2.0 depends on Hadoop 2.2.0 by default. If you can update your Spark dependency to 1.2.0 (or newer), that will solve the problem.
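If upgrading is an option, the exclusion block above becomes unnecessary; a single dependency such as the following sketch (1.2.0 shown as an example, any newer 2.10-build release works the same way) pulls in Hadoop 2.2.0 transitively:

```xml
<!-- Spark 1.2.0+ is built against Hadoop 2.2.0 by default,
     so no hadoop-core/hadoop-client exclusions are needed. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.2.0</version>
</dependency>
```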