Spark 1.5.1, Cassandra Connector 1.5.0-M2, Cassandra 2.1, Scala 2.10, NoSuchMethodError guava dependency


Problem description


I'm new to the Spark environment (and fairly new to Maven), so I'm struggling with how to ship the dependencies I need correctly.

It looks like Spark 1.5.1 pulls in a guava-14.0.1 dependency which it tries to use, but TypeToken.isPrimitive was only added in Guava 15+. What's the correct way to ensure my uber-jar wins? I've tried spark.executor.extraClassPath in my spark-defaults.conf to no avail.
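One way to check which Guava copy actually wins at runtime is to ask the classloader where it would load a class from. A minimal sketch (the `WhereLoaded` class name is mine; the probe works for any class, and `com.google.common.reflect.TypeToken` is the one in question here):

```java
// WhereLoaded.java -- print which jar/location a class would be loaded from.
public class WhereLoaded {
    // Returns the classpath location of the named class, or null if it is absent.
    static String locate(String className) {
        String resource = className.replace('.', '/') + ".class";
        java.net.URL url = ClassLoader.getSystemResource(resource);
        return url == null ? null : url.toString();
    }

    public static void main(String[] args) {
        // In the real app you would probe "com.google.common.reflect.TypeToken"
        // and check whether the printed URL points at guava-18.0.jar or at
        // Spark's bundled 14.0.1 copy. java.lang.String is shown here so the
        // demo also runs without Guava on the classpath.
        System.out.println(locate("java.lang.String"));
        System.out.println(locate("com.google.common.reflect.TypeToken"));
    }
}
```

Running this inside the Spark driver (rather than standalone) shows the classpath the executors actually see.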

Essentially a duplicate of this question: http://stackoverflow.com/questions/33583183/spark-1-5-1-scala-2-10-kafka-cassandra-java-lang-nosuchmethoderror (Spark 1.5.1 + Scala 2.10 + Kafka + Cassandra = java.lang.NoSuchMethodError), but for Maven (I don't have the rep to comment yet).

Stripped down my dependencies to this:

    <dependency>
        <groupId>com.google.guava</groupId>
        <artifactId>guava</artifactId>
        <version>18.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.commons</groupId>
        <artifactId>commons-compress</artifactId>
        <version>1.10</version>
    </dependency>
    <dependency>
        <groupId>com.esotericsoftware.kryo</groupId>
        <artifactId>kryo</artifactId>
        <version>2.21</version>
    </dependency>
    <dependency>
        <groupId>org.objenesis</groupId>
        <artifactId>objenesis</artifactId>
        <version>2.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.5.0</version>
        <exclusions>
            <exclusion>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
            </exclusion>
            <exclusion>
                <groupId>log4j</groupId>
                <artifactId>log4j</artifactId>
            </exclusion>
        </exclusions>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>1.5.0</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.10</artifactId>
        <version>1.5.0-M2</version>
    </dependency>

Shaded my JAR with all the dependencies using this:

       <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.3</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <artifactSet>
                            <excludes>
                                <exclude>org.apache.hadoop:*</exclude>
                                <exclude>org.apache.hbase:*</exclude>
                            </excludes>
                        </artifactSet>
                        <filters>
                            <filter>
                                <artifact>*:*</artifact>
                                <excludes>
                                    <exclude>META-INF/*.SF</exclude>
                                    <exclude>META-INF/*.DSA</exclude>
                                    <exclude>META-INF/*.RSA</exclude>
                                </excludes>
                            </filter>
                            <filter>
                                <artifact>org.apache.spark:spark-network-common_2.10</artifact>
                                <excludes>
                                    <exclude>com.google.common.base.*</exclude>
                                </excludes>
                            </filter>
                        </filters>
                        <transformers>
                            <transformer
                                    implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                <!-- merge multiple reference.conf files into one -->
                                <resource>reference.conf</resource>
                            </transformer>
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>
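After packaging, it helps to confirm that the shaded jar really contains the classes you expect (Guava 18's TypeToken rather than Spark's 14.0.1 copy). A hedged sketch that lists matching entries in any jar; the `JarScan` name is mine, and the `main` builds a throwaway zip standing in for the shaded jar so the demo runs standalone:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream;

public class JarScan {
    // List entries in a jar/zip whose names contain the given substring.
    static List<String> scan(Path jar, String needle) throws IOException {
        List<String> hits = new ArrayList<>();
        try (ZipFile zf = new ZipFile(jar.toFile())) {
            for (Enumeration<? extends ZipEntry> e = zf.entries(); e.hasMoreElements(); ) {
                String name = e.nextElement().getName();
                if (name.contains(needle)) hits.add(name);
            }
        }
        return hits;
    }

    public static void main(String[] args) throws IOException {
        // Demo: build a tiny zip standing in for the shaded jar.
        Path demo = Files.createTempFile("shaded-demo", ".jar");
        try (ZipOutputStream out = new ZipOutputStream(Files.newOutputStream(demo))) {
            out.putNextEntry(new ZipEntry("com/google/common/reflect/TypeToken.class"));
            out.closeEntry();
        }
        // Against the real artifact you would call something like:
        // scan(Paths.get("target/my-shaded.jar"), "com/google/common")
        System.out.println(scan(demo, "com/google/common"));
        Files.deleteIfExists(demo);
    }
}
```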

Here's my awesome explosion when I run:

./spark-submit --master local --class <my main class> <my shaded jar>

Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.reflect.TypeToken.isPrimitive()Z
at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:142)
at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:136)
at com.datastax.driver.core.TypeCodec$BlobCodec.<init>(TypeCodec.java:609)
at com.datastax.driver.core.TypeCodec$BlobCodec.<clinit>(TypeCodec.java:606)
at com.datastax.driver.core.CodecRegistry.<clinit>(CodecRegistry.java:147)
at com.datastax.driver.core.Configuration$Builder.build(Configuration.java:259)
at com.datastax.driver.core.Cluster$Builder.getConfiguration(Cluster.java:1135)
at com.datastax.driver.core.Cluster.<init>(Cluster.java:111)
at com.datastax.driver.core.Cluster.buildFrom(Cluster.java:178)
at com.datastax.driver.core.Cluster$Builder.build(Cluster.java:1152)
at com.datastax.spark.connector.cql.DefaultConnectionFactory$.createCluster(CassandraConnectionFactory.scala:85)
at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155)
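The NoSuchMethodError only surfaces once the driver first touches TypeCodec. A cheap way to fail fast with a clearer message is a reflective probe at startup; a sketch below (the `GuavaProbe` name and the startup-check idea are mine, but the class/method names mirror the failing call):

```java
public class GuavaProbe {
    // True if the named class exists and exposes the named zero-arg method.
    static boolean hasMethod(String className, String method) {
        try {
            Class.forName(className).getMethod(method);
            return true;
        } catch (ClassNotFoundException | NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // The Cassandra driver needs TypeToken.isPrimitive(), added in Guava 15.
        if (!hasMethod("com.google.common.reflect.TypeToken", "isPrimitive")) {
            System.err.println("Guava on the classpath is too old (< 15) or missing");
        }
        // Demo with a JDK class so this runs without Guava present:
        System.out.println(hasMethod("java.lang.String", "length"));   // prints "true"
    }
}
```

Calling such a probe at the top of the main class turns a deep driver-internals stack trace into one actionable line.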

Solution

Fixed my dependency issue by hard-including the Guava jar I needed via /conf/spark-defaults.conf:

spark.driver.extraClassPath /home/osboxes/Packages/guava-18.0.jar
spark.executor.extraClassPath /home/osboxes/Packages/guava-18.0.jar
