java.lang.NoSuchMethodError when I try to parse Json on spark


Problem description

I've run into an issue when trying to use com.typesafe.play play-json 2.4.0 on Spark. The following code throws an exception on the Spark server, but it runs perfectly on my PC:

import play.api.libs.json.Json

val json = Json.parse(json_string)

The exception:

java.lang.NoSuchMethodError: com.fasterxml.jackson.core.JsonToken.id()I
    at play.api.libs.json.jackson.JsValueDeserializer.deserialize(JacksonJson.scala:122)
    at play.api.libs.json.jackson.JsValueDeserializer.deserialize(JacksonJson.scala:108)
    at play.api.libs.json.jackson.JsValueDeserializer.deserialize(JacksonJson.scala:103)
    at com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:2860)
    at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:1569)
    at play.api.libs.json.jackson.JacksonJson$.parseJsValue(JacksonJson.scala:226)
    at play.api.libs.json.Json$.parse(Json.scala:21)
    at org.soprism.kafka.connector.TwitterToCassandraPostsParser$.ParseJson(TwitterToCassandraPostsParser.scala:74)
    at org.soprism.kafka.connector.TwitterToCassandraPostsParser$$anonfun$1$$anonfun$apply$1.apply(TwitterToCassandraPostsParser.scala:65)
    at org.soprism.kafka.connector.TwitterToCassandraPostsParser$$anonfun$1$$anonfun$apply$1.apply(TwitterToCassandraPostsParser.scala:65)
    at scala.collection.Iterator$class.foreach(Iterator.scala:727)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:798)
    at org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:798)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1503)
    at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:1503)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
    at org.apache.spark.scheduler.Task.run(Task.scala:64)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:203)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

I execute it with a spark-submit command.

There seems to be an incompatibility between two versions of the Jackson library. How can I fix it?

Thanks

Answer

Spark nodes will NOT check your dependencies for you. You need to build an uber-jar that includes all of your dependencies and pass that jar to spark-submit so it can be distributed to the worker nodes.
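As an illustration, here is a minimal sketch of what that can look like with sbt and the sbt-assembly plugin. The project name, versions, and main class below are assumptions for the example, not taken from the question; the shading rule is what resolves a Jackson version clash like this one, because Spark's own (older) Jackson on the executor classpath otherwise wins over the one play-json needs.

```scala
// project/plugins.sbt -- enable sbt-assembly (version is an example)
// addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")

// build.sbt
name := "twitter-to-cassandra" // hypothetical project name

libraryDependencies ++= Seq(
  // Mark Spark as "provided": the cluster already ships it,
  // so it must stay OUT of the uber-jar
  "org.apache.spark" %% "spark-core" % "1.3.0" % "provided",
  "com.typesafe.play" %% "play-json"  % "2.4.0"
)

// play-json 2.4.0 needs a newer Jackson than older Spark distributions
// bundle; shading renames the Jackson classes inside the uber-jar so they
// can no longer collide with Spark's copy
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.fasterxml.jackson.**" -> "shaded.jackson.@1").inAll
)
```

After `sbt assembly`, the resulting jar is what you pass to spark-submit, e.g. `spark-submit --class org.soprism.kafka.connector.TwitterToCassandraPostsParser target/scala-2.10/twitter-to-cassandra-assembly-1.0.jar` (paths and version suffix depend on your build).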

