Spark running error java.lang.NoClassDefFoundError: org/codehaus/jackson/annotate/JsonClass


Problem Description


import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import play.api.libs.json._
import java.util.Date
import javax.xml.bind.DatatypeConverter

object Test {
  def main(args: Array[String]): Unit = {
    val logFile = "test.txt"
    val conf = new SparkConf().setAppName("Json Test")
    val sc = new SparkContext(conf)
    try {
      val out = "output/test"
      // cleanTypo is a helper defined elsewhere in the original job (not shown)
      val logData = sc.textFile(logFile, 2).map(line => Json.parse(cleanTypo(line))).cache()
    } finally {
      sc.stop()
    }
  }
}

Since this was said to be about the known Spark/Jackson conflict problem, I have rebuilt Spark using:

mvn versions:use-latest-versions -Dincludes=org.codehaus.jackson:jackson-core-asl
mvn versions:use-latest-versions -Dincludes=org.codehaus.jackson:jackson-mapper-asl
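A quick way to confirm which Jackson versions Maven actually resolved after a change like this is the standard dependency:tree goal, filtered down to the old Jackson group (the filter value below just illustrates the goal's -Dincludes option):

mvn dependency:tree -Dincludes=org.codehaus.jackson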

So the jars have been updated to 1.9.x, but I still get the error:

15/03/02 03:12:19 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.NoClassDefFoundError: org/codehaus/jackson/annotate/JsonClass
    at org.codehaus.jackson.map.introspect.JacksonAnnotationIntrospector.findDeserializationType(JacksonAnnotationIntrospector.java:524)
    at org.codehaus.jackson.map.deser.BasicDeserializerFactory.modifyTypeByAnnotation(BasicDeserializerFactory.java:732)
    at org.codehaus.jackson.map.deser.BeanDeserializerFactory.createBeanDeserializer(BeanDeserializerFactory.java:427)
    at org.codehaus.jackson.map.deser.StdDeserializerProvider._createDeserializer(StdDeserializerProvider.java:398)
    at org.codehaus.jackson.map.deser.StdDeserializerProvider._createAndCache2(StdDeserializerProvider.java:307)
    at org.codehaus.jackson.map.deser.StdDeserializerProvider._createAndCacheValueDeserializer(StdDeserializerProvider.java:287)
    at org.codehaus.jackson.map.deser.StdDeserializerProvider.findValueDeserializer(StdDeserializerProvider.java:136)
    at org.codehaus.jackson.map.deser.StdDeserializerProvider.findTypedValueDeserializer(StdDeserializerProvider.java:157)
    at org.codehaus.jackson.map.ObjectMapper._findRootDeserializer(ObjectMapper.java:2468)
    at org.codehaus.jackson.map.ObjectMapper._readValue(ObjectMapper.java:2383)
    at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:1094)
    at play.api.libs.json.JacksonJson$.parseJsValue(JsValue.scala:477)
    at play.api.libs.json.Json$.parse(Json.scala:16)

Solution

We hit almost exactly the same issue. We were trying to use 1.9.2 and hit a NoSuchMethodError as well.

Annoyingly, there is not just one version conflict to deal with, but two. First of all, Spark depends on Hadoop (for HDFS), which depends on a 1.8.x build of the Jackson JSON library; this is the conflict you are seeing. Spark (at least 1.2+) then uses the Jackson 2.4.4 core, which actually got moved to com.fasterxml.jackson.core, so it does not conflict with 1.8.x due to the different package names.
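To see which jar actually supplies each Jackson flavour at runtime, you can ask the JVM for the code source of a representative class from each package. This is a generic diagnostic sketch, not part of the original job; the object name is invented, and the two class names are simply taken from the two packages described above:

// Prints the jar (code source) that provides each Jackson namespace.
// Class.forName avoids needing a compile-time dependency on either one.
object JacksonWhereabouts {
  def locate(className: String): Unit = {
    try {
      val cls = Class.forName(className)
      val src = cls.getProtectionDomain.getCodeSource
      println(s"$className -> ${if (src == null) "bootstrap/unknown" else src.getLocation}")
    } catch {
      case t: Throwable => println(s"$className -> not loadable: $t")
    }
  }

  def main(args: Array[String]): Unit = {
    locate("org.codehaus.jackson.map.ObjectMapper")       // old 1.x Jackson (Hadoop)
    locate("com.fasterxml.jackson.databind.ObjectMapper") // new 2.x Jackson (Spark core)
  }
}

Running this through spark-submit prints the jar path behind each class, which makes the 1.8.x-versus-your-build collision directly visible.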

So in your case your code should work if you do one of three things (a build-file sketch for the first two follows the list):

  1. upgrade to a 2.4.x build that is LESS THAN OR EQUAL TO 2.4.4, since the actual dependency will be replaced by Spark's, which is 2.4.4 (at the time of writing this)
  2. downgrade to a 1.8.x build that is LESS THAN OR EQUAL TO the 1.8.x build which Hadoop is using
  3. compile Spark under your 1.9.x build. I know you mention this and that it didn't work, but when we tried it, it was successful; we ran the build with the option -Dcodehaus.jackson.version=1.9.2
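If the job is built with sbt, options 1 and 2 amount to pinning the Jackson artifacts in the build file. The sketch below is a minimal illustration under assumed version numbers (spark-core 1.2.1, Jackson 1.8.8); check what your actual Spark and Hadoop distributions ship before copying any of them:

// build.sbt sketch -- version numbers here are assumptions, not prescriptions
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1" % "provided"

// Option 2: pin the old-namespace Jackson to the build Hadoop ships,
// so only one org.codehaus.jackson version reaches the classpath.
dependencyOverrides ++= Set(
  "org.codehaus.jackson" % "jackson-core-asl"   % "1.8.8",
  "org.codehaus.jackson" % "jackson-mapper-asl" % "1.8.8"
)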

Unfortunately, there are going to be a lot more issues like this to come, due to the nature of Spark and how it already has all of its own internal dependencies on the classpath, so any conflicting job dependencies will never work out. Spark already does some dependency shading to avoid this issue with packages like Guava, but this is not currently done with Jackson.
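Until it is, a job-side workaround is to shade Jackson yourself when building the assembly jar, so the copy your job bundles can never collide with the one Hadoop puts on Spark's classpath. A sketch assuming the sbt-assembly plugin (0.14+, where ShadeRule is available):

// build.sbt, with sbt-assembly on the plugin classpath
assemblyShadeRules in assembly := Seq(
  // Rewrites the org.codehaus.jackson classes bundled into the fat jar
  // (and every reference to them) to a private package name.
  ShadeRule.rename("org.codehaus.jackson.**" -> "shaded.jackson.@1").inAll
)

After the rename, the classes your job ships no longer share a name with the 1.8.x copy Hadoop provides, so which one the classloader finds first stops mattering.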
