Apache Spark, creating hive context - NoSuchMethodException


Problem description

I have the following problem. My main method is:

import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.sql.hive.HiveContext;

public static void main(String[] args) {
     // Create the Spark context, then wrap it in a HiveContext for Hive support.
     SparkConf conf = new SparkConf().setAppName("TestHive");
     SparkContext sc = new SparkContext(conf);
     HiveContext hiveContext = new HiveContext(sc);
}

I build it with mvn package. Then I submit my code; however, I get the following exception, and I have no idea what's wrong:

sh spark-submit --class "TestHive" --master local[4] ~/target/test-1.0-SNAPSHOT-jar-with-dependencies.jar 

Exception in thread "main" java.lang.NoSuchMethodException: org.apache.hadoop.hive.conf.HiveConf.getTimeVar(org.apache.hadoop.hive.conf.HiveConf$ConfVars, java.util.concurrent.TimeUnit)
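
The jar name in the submit command suggests the fat jar is produced by the Maven assembly plugin. A minimal sketch of that configuration, assuming the standard jar-with-dependencies descriptor (the plugin setup itself is not shown in the question):

<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-assembly-plugin</artifactId>
            <configuration>
                <descriptorRefs>
                    <!-- builds test-1.0-SNAPSHOT-jar-with-dependencies.jar during mvn package -->
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
            </configuration>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>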

Please tell me where I am going wrong.

P.S. I built my Spark with Hive and the Thrift server.

Spark 1.5.2 built for Hadoop 2.4.0
Build flags: -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn
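
Those flags correspond to a Maven build of Spark along these lines (a sketch; the exact command used is not shown in the question):

# Build Spark 1.5.2 from source with the profiles listed above
mvn -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn -DskipTests clean package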

Solution

It seems to be a version conflict between the Spark components (spark-core, spark-sql, and spark-hive).
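
One way to confirm which versions were actually bundled is to inspect the fat jar and the resolved dependency tree (a quick check with the standard jar and mvn tools; the jar path is taken from the submit command in the question):

# List the HiveConf classes packaged into the fat jar
jar tf ~/target/test-1.0-SNAPSHOT-jar-with-dependencies.jar | grep HiveConf
# Show which versions of the Spark artifacts Maven resolved
mvn dependency:tree -Dincludes=org.apache.spark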

To avoid this conflict, all versions of those components should be the same. You can do that in your pom.xml by setting a property called spark.version, for example:

<properties>
    <spark.version>1.6.0</spark.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>
</dependencies>
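
After aligning the versions and rebuilding, a quick sanity check is to print the Spark version the driver actually loaded; a minimal sketch, meant to go inside the main method from the question:

// Print the Spark version loaded at runtime; it should match
// the spark.version property declared in pom.xml.
System.out.println("Loaded Spark version: " + sc.version());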
