SBT Test Error: java.lang.NoSuchMethodError: net.jpountz.lz4.LZ4BlockInputStream


Problem Description

I am getting the exception below when I try to run unit tests for my Spark Streaming code with ScalaTest from SBT on Windows.

sbt testOnly <<ClassName>>


2018-06-18 02:39:00 ERROR Executor:91 - Exception in task 1.0 in stage 3.0 (TID 11)
java.lang.NoSuchMethodError: net.jpountz.lz4.LZ4BlockInputStream.<init>(Ljava/io/InputStream;Z)V
    at org.apache.spark.io.LZ4CompressionCodec.compressedInputStream(CompressionCodec.scala:122)
    at org.apache.spark.serializer.SerializerManager.wrapForCompression(SerializerManager.scala:163)
    at org.apache.spark.serializer.SerializerManager.wrapStream(SerializerManager.scala:124)
    at org.apache.spark.shuffle.BlockStoreShuffleReader$$anonfun$2.apply(BlockStoreShuffleReader.scala:50)
    at org.apache.spark.shuffle.BlockStoreShuffleReader$$anonfun$2.apply(BlockStoreShuffleReader.scala:50)
    at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:417)
    at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:61)
    at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
    at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
    at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:32)
    at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
    at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.sort_addToSorter$(Unknown Source)
    at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
    at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
    at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$10$$anon$1.hasNext(WholeStageCodegenExec.scala:614)
    at org.apache.spark.sql.execution.GroupedIterator$.apply(GroupedIterator.scala:29)
    at org.apache.spark.sql.execution.streaming.FlatMapGroupsWithStateExec$StateStoreUpdater.updateStateForKeysWithData(FlatMapGroupsWithStateExec.scala:176)

I tried a couple of things to exclude the net.jpountz.lz4 jar (following suggestions from other posts), but the same error shows up in the output again.

I am currently using Spark 2.3, ScalaTest 3.0.5, and Scala 2.11. I only see this issue after upgrading to Spark 2.3 and ScalaTest 3.0.5.

Any suggestions?

Answer

Kafka has a conflicting dependency with Spark, and that is what caused this issue for me.
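Before changing the build, it can help to confirm which lz4 artifact actually ends up on your test classpath. A minimal sketch using standard sbt tasks (assuming sbt 1.x slash syntax; the exact output depends on your build):

sbt evicted
sbt "show Test/dependencyClasspath"

The first command reports version conflicts that sbt resolved by eviction; the second lists every jar on the test classpath, so you can see which module is pulling in net.jpountz.lz4.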

This is how you can exclude the dependency in your sbt file:

lazy val excludeJpountz = ExclusionRule(organization = "net.jpountz.lz4", name = "lz4")

lazy val kafkaClients = "org.apache.kafka" % "kafka-clients" % userKafkaVersionHere excludeAll(excludeJpountz) // add more exclusions here

When you use this kafkaClients dependency, it will now exclude the problematic lz4 library.
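For context, here is a minimal build.sbt sketch showing how the exclusion rule above might be wired into the dependency list. The version numbers are illustrative (taken from the question), and the exact set of Spark/Kafka modules will differ per project:

// build.sbt -- minimal sketch; versions are illustrative
scalaVersion := "2.11.12"

lazy val excludeJpountz = ExclusionRule(organization = "net.jpountz.lz4", name = "lz4")

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"            % "2.3.0",
  "org.apache.spark" %% "spark-sql-kafka-0-10" % "2.3.0"    excludeAll(excludeJpountz),
  "org.apache.kafka" %  "kafka-clients"        % "0.11.0.2" excludeAll(excludeJpountz),
  "org.scalatest"    %% "scalatest"            % "3.0.5"    % Test
)

With the rule applied to every module that transitively depends on it, only the lz4 version that Spark itself expects remains on the classpath.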

Update: This appears to be an issue with Kafka 0.11.x.x and earlier versions. As of 1.x.x, Kafka seems to have moved away from the problematic net.jpountz.lz4 library. Therefore, using the latest Kafka (1.x) with the latest Spark (2.3.x) should not have this issue.
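As a rough sketch of that alternative (the version number is illustrative; check compatibility with the Kafka connector you use), bumping the client instead of excluding lz4 could look like this:

lazy val kafkaClients = "org.apache.kafka" % "kafka-clients" % "1.1.0"  // per the update above, 1.x no longer pulls in net.jpountz.lz4, so the exclusion rule becomes unnecessary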

