Spark 2.3.0 netty version issue: NoSuchMethod io.netty.buffer.PooledByteBufAllocator.metric()


Question

I just upgraded my Spark project from 2.2.1 to 2.3.0 and ran into the versioning exception below. I depend on spark-cassandra-connector 2.0.7 and cassandra-driver-core 3.4.0 from DataStax, which in turn depend on netty 4.x, whereas Spark 2.3.0 uses 3.9.x.

The class raising the exception, org.apache.spark.network.util.NettyMemoryMetrics, was introduced in Spark 2.3.0.

Is downgrading my Cassandra dependencies the only way around the exception? Thanks!

Exception in thread "main" java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
at org.apache.spark.network.util.NettyMemoryMetrics.registerMetrics(NettyMemoryMetrics.java:80)
at org.apache.spark.network.util.NettyMemoryMetrics.<init>(NettyMemoryMetrics.java:76)
at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:109)
at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:423)

Answer

It seems you are using a netty 4 version that is too old. Maybe you have multiple netty versions on your classpath? Having both netty 4.x and 3.x on the classpath should not be a problem.
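To confirm the "multiple netty versions on the classpath" suspicion, you can ask the JVM which jar actually provides `PooledByteBufAllocator` and whether it has a `metric()` method. The helper below is a hedged diagnostic sketch (class and method names `ClasspathProbe`/`probe` are my own, not from the question); in a real Spark driver you would probe `io.netty.buffer.PooledByteBufAllocator` with method `"metric"`.

```java
// Hypothetical diagnostic helper: reports which classpath entry provides a class
// and whether a given no-arg method exists on it. If the netty jar that wins
// dependency resolution is 4.0.x, metric() will be reported as MISSING.
public class ClasspathProbe {
    static String probe(String className, String methodName) {
        try {
            Class<?> c = Class.forName(className);
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            // Bootstrap/JDK classes have no CodeSource.
            String loc = (src == null) ? "<bootstrap classpath>" : src.getLocation().toString();
            try {
                c.getMethod(methodName);
                return loc + " | has " + methodName + "()";
            } catch (NoSuchMethodException e) {
                return loc + " | MISSING " + methodName + "()";
            }
        } catch (ClassNotFoundException e) {
            return "class not found: " + className;
        }
    }

    public static void main(String[] args) {
        // A JDK class so the sketch runs anywhere; swap in the netty class in your app.
        System.out.println(probe("java.lang.String", "length"));
        System.out.println(probe("io.netty.buffer.PooledByteBufAllocator", "metric"));
    }
}
```

Run this from the same launcher (spark-submit, application server, etc.) that produces the error, since that is the classpath that matters.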
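If an older netty 4.0.x is indeed winning dependency resolution, one common fix (my assumption, not stated in the answer above) is to force the netty 4.1.x line that Spark 2.3.0 builds against instead of downgrading the Cassandra dependencies. In a Maven build that could look like the following sketch; verify the exact version your Spark distribution ships with `mvn dependency:tree` before pinning:

```xml
<!-- Sketch: pin netty to the 4.1.x line Spark 2.3.0 expects.
     4.1.17.Final is an assumption; confirm against your resolved tree. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>io.netty</groupId>
      <artifactId>netty-all</artifactId>
      <version>4.1.17.Final</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

`PooledByteBufAllocator.metric()` exists only in netty 4.1.x, which is why a 4.0.x jar on the classpath triggers the `NoSuchMethodError`.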

