Cannot run queries in SQLContext from Apache Spark SQL 1.5.2, getting java.lang.NoSuchMethodError


Problem description


I have a Java application using Spark SQL (Spark 1.5.2 using local mode), but I cannot execute any SQL commands without getting errors.

This is the code I am executing:

//confs
SparkConf sparkConf = new SparkConf();  
sparkConf.set("spark.master","local");
sparkConf.set("spark.app.name","application01");
sparkConf.set("spark.driver.host","10.1.1.36");
sparkConf.set("spark.driver.port", "51810");
sparkConf.set("spark.executor.port", "51815");
sparkConf.set("spark.repl.class.uri","http://10.1.1.36:46146");
sparkConf.set("spark.executor.instances","2");
sparkConf.set("spark.jars","");
sparkConf.set("spark.executor.id","driver");
sparkConf.set("spark.submit.deployMode","client");
sparkConf.set("spark.fileserver.uri","http://10.1.1.36:47314");
sparkConf.set("spark.localProperties.clone","true");
sparkConf.set("spark.app.id","app-45631207172715-0002");

//Initialize contexts
JavaSparkContext sparkContext = new JavaSparkContext(sparkConf);
SQLContext sqlContext = new SQLContext(sparkContext);           

//execute command
sqlContext.sql("show tables").show();

Spark dependencies in pom.xml look like this:

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.5.2</version>
</dependency>

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.10</artifactId>
  <version>1.5.2</version>
</dependency>

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-hive_2.10</artifactId>
  <version>1.5.2</version>
</dependency>

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-repl_2.10</artifactId>
  <version>1.5.2</version>
</dependency>

Here is the error I am getting:

java.lang.NoSuchMethodError: com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer$.handledType()Ljava/lang/Class;

The stack trace is here.

My application is a web application running on Tomcat 7. I don't have any other configuration files. What could I be doing wrong? Could it be some dependency conflict, since I am able to run the same code in a clean project?
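
To narrow down a possible conflict, one quick check is to print, from inside the running web app, which jars actually supply the Jackson classes involved. The sketch below is only a hypothetical diagnostic (the class names come from the error message above; the class name JacksonClasspathCheck is made up):

// Hypothetical diagnostic: prints which jars provide the Jackson classes at runtime.
// Run it inside the same web app (e.g. from a servlet) so it sees the Tomcat classpath.
import com.fasterxml.jackson.databind.ObjectMapper;

public class JacksonClasspathCheck {
    public static void main(String[] args) throws ClassNotFoundException {
        // Jar location and reported version of jackson-databind
        // (getCodeSource() can be null for bootstrap classes, but not for a jar in WEB-INF/lib)
        System.out.println(ObjectMapper.class.getProtectionDomain()
                .getCodeSource().getLocation());
        System.out.println(ObjectMapper.class.getPackage().getImplementationVersion());

        // Jar that provides the Scala module object named in the NoSuchMethodError
        Class<?> deser = Class.forName(
                "com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer$");
        System.out.println(deser.getProtectionDomain().getCodeSource().getLocation());
    }
}

If the two locations point to different Jackson versions (for example one bundled in WEB-INF/lib and one pulled in by Spark), that mismatch would explain the NoSuchMethodError.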

EDIT: I found an issue that gives some more information about the problem.

Solution

BigDecimalDeserializer wasn't introduced to FasterXML/jackson-module-scala until 2.4. Confirm the following:

  1. The same jars you compile with are on the classpath at runtime.
  2. ${fasterxml.jackson.version} in the pom.xml file for Spark SQL is 2.4.x or greater.

<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-databind</artifactId>
  <version>2.4.4</version>
</dependency>
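
If the web application (or another library already deployed on Tomcat) drags in an older Jackson, one way to keep every module on the same version is to pin Jackson in dependencyManagement. This is only a sketch of one possible fix: the 2.4.4 version mirrors the snippet above, and jackson-module-scala_2.10 is assumed to be the Scala-module artifact Spark 1.5.2 pulls in; verify both with mvn dependency:tree before relying on them.

<!-- Sketch only: force a single Jackson version for the whole build so the web
     app and Spark SQL agree at runtime. The versions and the scala-module
     artifact are assumptions; check them against mvn dependency:tree. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>2.4.4</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.module</groupId>
      <artifactId>jackson-module-scala_2.10</artifactId>
      <version>2.4.4</version>
    </dependency>
  </dependencies>
</dependencyManagement>

On Tomcat it is also worth checking WEB-INF/lib for a stray older jackson-databind jar left behind by another dependency.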
