Zeppelin + Spark: Reading Parquet from S3 throws NoSuchMethodError: com.fasterxml.jackson


Problem description

Using Zeppelin 0.7.2 binaries from the main download, and Spark 2.1.0 w/ Hadoop 2.6, the following paragraph:

val df = spark.read.parquet(DATA_URL).filter(FILTER_STRING).na.fill("")

produces the following:

java.lang.NoSuchMethodError: com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer$.handledType()Ljava/lang/Class;
  at com.fasterxml.jackson.module.scala.deser.NumberDeserializers$.<init>(ScalaNumberDeserializersModule.scala:49)
  at com.fasterxml.jackson.module.scala.deser.NumberDeserializers$.<clinit>(ScalaNumberDeserializersModule.scala)
  at com.fasterxml.jackson.module.scala.deser.ScalaNumberDeserializersModule$class.$init$(ScalaNumberDeserializersModule.scala:61)
  at com.fasterxml.jackson.module.scala.DefaultScalaModule.<init>(DefaultScalaModule.scala:20)
  at com.fasterxml.jackson.module.scala.DefaultScalaModule$.<init>(DefaultScalaModule.scala:37)
  at com.fasterxml.jackson.module.scala.DefaultScalaModule$.<clinit>(DefaultScalaModule.scala)
  at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
  at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
  at org.apache.spark.SparkContext.withScope(SparkContext.scala:701)
  at org.apache.spark.SparkContext.parallelize(SparkContext.scala:715)
  at org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat$.mergeSchemasInParallel(ParquetFileFormat.scala:594)
  at org.apache.spark.sql.execution.datasources.parquet.ParquetFileFormat.inferSchema(ParquetFileFormat.scala:235)
  at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$7.apply(DataSource.scala:184)
  at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$7.apply(DataSource.scala:184)
  at scala.Option.orElse(Option.scala:289)
  at org.apache.spark.sql.execution.datasources.DataSource.org$apache$spark$sql$execution$datasources$DataSource$$getOrInferFileFormatSchema(DataSource.scala:183)
  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:387)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152)
  at org.apache.spark.sql.DataFrameReader.parquet(DataFrameReader.scala:441)
  at org.apache.spark.sql.DataFrameReader.parquet(DataFrameReader.scala:425)
  ... 47 elided

This error does not happen in the normal spark-shell, only in Zeppelin. I have attempted the following fixes, which do nothing:

  • Downloading the jackson 2.6.2 jars to the zeppelin lib folder and restarting
  • Adding the jackson 2.9 dependencies from the Maven repository to the interpreter settings
  • Deleting the jackson jars from the zeppelin lib folder
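One way to narrow a conflict like this down is to compare the Jackson jars that the Zeppelin and Spark installations each ship. A minimal diagnostic sketch (Python; the paths and the helper name `find_jackson_jars` are assumptions, not from the original post):

```python
from pathlib import Path

def find_jackson_jars(*dirs):
    """Return the names of jar files containing 'jackson' under each directory."""
    hits = []
    for d in dirs:
        root = Path(d)
        if root.is_dir():
            hits.extend(sorted(
                f.name for f in root.glob("**/*.jar")
                if "jackson" in f.name.lower()
            ))
    return hits

# Paths are assumptions -- point these at your own installs.
print(find_jackson_jars("/opt/zeppelin/lib", "/opt/spark/jars"))
```

If the two installations report different Jackson versions, a classpath conflict is the likely cause of the NoSuchMethodError.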

Googling is turning up no similar situations. Please don't hesitate to ask for more information, or make suggestions. Thanks!

Answer

I had the same problem. I added com.amazonaws:aws-java-sdk and org.apache.hadoop:hadoop-aws as dependencies for the Spark interpreter. These dependencies bring in their own versions of com.fasterxml.jackson.core:* and conflict with Spark's.

You must also exclude com.fasterxml.jackson.core:* from the other dependencies. Here is an example of the Spark interpreter dependency section from ${ZEPPELIN_HOME}/conf/interpreter.json:

"dependencies": [
  {
    "groupArtifactVersion": "com.amazonaws:aws-java-sdk:1.7.4",
    "local": false,
    "exclusions": ["com.fasterxml.jackson.core:*"]
  },
  {
    "groupArtifactVersion": "org.apache.hadoop:hadoop-aws:2.7.1",
    "local": false,
    "exclusions": ["com.fasterxml.jackson.core:*"]
  }
]
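If several dependency entries need the same exclusion, it may be easier to patch them in one pass. A minimal sketch (Python; the helper name `add_jackson_exclusion` is a hypothetical, and the dependency values are taken from the fragment above):

```python
import json

EXCLUSION = "com.fasterxml.jackson.core:*"

def add_jackson_exclusion(dependencies):
    """Append the Jackson exclusion to each dependency entry that lacks it."""
    for dep in dependencies:
        exclusions = dep.setdefault("exclusions", [])
        if EXCLUSION not in exclusions:
            exclusions.append(EXCLUSION)
    return dependencies

deps = [
    {"groupArtifactVersion": "com.amazonaws:aws-java-sdk:1.7.4", "local": False},
    {"groupArtifactVersion": "org.apache.hadoop:hadoop-aws:2.7.1", "local": False},
]
print(json.dumps(add_jackson_exclusion(deps), indent=2))
```

Restart the Spark interpreter after editing interpreter.json so the new exclusions take effect.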

