spark throws java.lang.NoClassDefFoundError: kafka/common/TopicAndPartition

Question

When I use the spark-submit command in a Cloudera YARN environment, I get this exception:

java.lang.NoClassDefFoundError: kafka/common/TopicAndPartition
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
    at java.lang.Class.getDeclaredMethods(Class.java:1975)
    at com.fasterxml.jackson.module.scala.introspect.BeanIntrospector$.com$fasterxml$jackson$module$scala$introspect$BeanIntrospector$$listMethods$1(BeanIntrospector.scala:93)
    at com.fasterxml.jackson.module.scala.introspect.BeanIntrospector$.findMethod$1(BeanIntrospector.scala:99)
    at com.fasterxml.jackson.module.scala.introspect.BeanIntrospector$.com$fasterxml$jackson$module$scala$introspect$BeanIntrospector$$findGetter$1(BeanIntrospector.scala:124)
    at com.fasterxml.jackson.module.scala.introspect.BeanIntrospector$$anonfun$3$$anonfun$apply$5.apply(BeanIntrospector.scala:177)
    at com.fasterxml.jackson.module.scala.introspect.BeanIntrospector$$anonfun$3$$anonfun$apply$5.apply(BeanIntrospector.scala:173)
    at scala.collection.TraversableLike$WithFilter$$anonfun$map$2.apply(TraversableLike.scala:722)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
    at scala.collection.TraversableLike$WithFilter.map(TraversableLike.scala:721)
    at com.fasterxml.jackson.module.scala.introspect.BeanIntrospector$$anonfun$3.apply(BeanIntrospector.scala:173)
    at com.fasterxml.jackson.module.scala.introspect.BeanIntrospector$$anonfun$3.apply(BeanIntrospector.scala:172)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
    at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
    at scala.collection.immutable.List.foreach(List.scala:318)
...

The spark-submit command is:

spark-submit --master yarn-cluster \
        --num-executors $2 \
        --executor-cores $3 \
        --class "APP" \
        --deploy-mode cluster \
        --properties-file $1 \
        --files $HDFS_PATH/log4j.properties,$HDFS_PATH/metrics.properties \
        --conf spark.metrics.conf=metrics.properties \
        APP.jar

Note that TopicAndPartition.class is in the shaded APP.jar.
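Since a jar is just a zip archive, you can confirm the class really made it into the shaded artifact by listing the jar's entries. A minimal self-contained sketch (it fabricates a stand-in APP.jar first so it runs anywhere; against the real artifact you would only run the final listing command):

```shell
# Build a stand-in shaded jar so the snippet is self-contained: a jar is a
# zip archive, so Python's stdlib zipfile CLI is enough to create and list one.
mkdir -p demo/kafka/common
touch demo/kafka/common/TopicAndPartition.class
python3 -m zipfile -c APP.jar demo/kafka

# The actual check: list the jar's entries and look for the class.
# (Equivalently, with a JDK installed: jar tf APP.jar | grep TopicAndPartition)
python3 -m zipfile -l APP.jar | grep TopicAndPartition
```

If the grep matches, the class is packaged and the NoClassDefFoundError points at a runtime classpath or version conflict rather than a missing class in the jar.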

Answer

After trying several approaches, it turned out that the issue was caused by version incompatibility. As @user1050619 said, make sure the versions of Kafka, Spark, ZooKeeper, and Scala are compatible with each other.
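In a shaded build, the usual way to enforce that compatibility is to pin the Spark and Kafka artifacts to one matching line in the build definition. A hypothetical build.sbt fragment (the versions and Scala suffix here are assumptions, not the asker's actual build; adjust to your cluster). The relevant detail is that kafka.common.TopicAndPartition lives in the old Scala kafka_2.xx core artifact (the 0.8-era API), which is what Spark 1.x's spark-streaming-kafka integration depends on:

```scala
// Hypothetical build.sbt fragment — versions are assumptions, not the
// asker's actual build. The key point: the Scala binary version (the %%
// suffix), the Spark version, and the Kafka client line must all agree.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"            % "1.6.3" % "provided",
  "org.apache.spark" %% "spark-streaming"       % "1.6.3" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.6.3",   // Kafka 0.8 integration
  "org.apache.kafka" %% "kafka"                 % "0.8.2.2"
)
```

Marking the jars the cluster already provides as "provided", and shading only what the cluster lacks, also avoids two different versions of the same class ending up on the runtime classpath.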
