Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/sql/catalyst/analysis/OverrideFunctionRegistry


Problem description


I have tried the code below with Spark and Scala; I am attaching the code and the pom.xml.

package com.Spark.ConnectToHadoop

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object CountWords {

  def main(args: Array[String]) {
    // Connect to the standalone master
    val objConf = new SparkConf().setAppName("Spark Connection").setMaster("spark://IP:7077")
    val sc = new SparkContext(objConf)

    // HiveContext lives in spark-hive and builds on spark-sql / spark-catalyst
    val objHiveContext = new HiveContext(sc)
    objHiveContext.sql("USE test")

    // Collect the table list to the driver and print each row
    val testing = objHiveContext.sql("show tables").collect()
    for (i <- 0 until testing.length) {
      println(testing(i))
    }
  }
}

I have added the spark-core_2.10, spark-catalyst_2.10, spark-sql_2.10, and spark-hive_2.10 dependencies. Do I need to add any more dependencies?

Edit:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.Sudhir.Maven1</groupId>
    <artifactId>SparkDemo</artifactId>
    <version>IntervalMeterData1</version>
    <packaging>jar</packaging>

    <name>SparkDemo</name>
    <url>http://maven.apache.org</url>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <spark.version>1.5.2</spark.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.5.2</version>
        </dependency> 
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.5.2</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-catalyst_2.10</artifactId>
            <version>1.5.2</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>1.2.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-jdbc</artifactId>
            <version>1.2.1</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>     
    </dependencies>
</project>

Solution

It looks like you forgot to bump spark-hive: it is still at 1.2.1 while the other Spark artifacts are at 1.5.2, so at runtime it references Catalyst internals (such as OverrideFunctionRegistry) that no longer exist in the 1.5.2 jars on the classpath:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>1.5.2</version>
    </dependency>
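
Mixing Spark artifact versions within one build is a classic cause of NoClassDefFoundError, since each Spark module is compiled against the internals of its siblings from the same release. To double-check which versions Maven actually resolves, you can filter the dependency tree to the Spark group (a standard maven-dependency-plugin goal; the includes filter here is just the groupId):

    mvn dependency:tree -Dincludes=org.apache.spark

Every org.apache.spark artifact in the output should report the same version.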

Consider introducing a Maven property, such as spark.version:

   <properties>
        <spark.version>1.5.2</spark.version>
    </properties>

and modifying all your Spark dependencies in this manner:

   <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-hive_2.10</artifactId>
        <version>${spark.version}</version>
    </dependency>

That way, bumping up the Spark version later won't be as painful.

Just adding the spark.version property to your <properties> is not enough; you have to reference it as ${spark.version} in each dependency.
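
Putting it all together, the dependencies section would look like the sketch below (the hive-jdbc and junit entries are carried over unchanged from the pom above). Strictly speaking, spark-sql already pulls in spark-catalyst transitively, so the explicit catalyst entry is optional, but pinning it does no harm as long as the version matches:

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-catalyst_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_2.10</artifactId>
            <version>${spark.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-jdbc</artifactId>
            <version>1.2.1</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>3.8.1</version>
            <scope>test</scope>
        </dependency>
    </dependencies>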
