Spark java.lang.NoSuchMethodError From Janino and Commons-Compiler

Question

I am building an application that uses Spark for Random Forest based classification. When I try to run the program, I get an exception from this line:

StringIndexerModel labelIndexer = new StringIndexer().setInputCol("label").setOutputCol("indexedLabel").fit(data);

It looks like the code somehow ends up on Janino version 2.7.8, although I understand I need 3.0.7. I have no idea how to set the dependencies correctly to force the build to use the right version; it always seems to resolve to 2.7.8.

Is it possible that I somehow need to clean the cache?

Here are the relevant lines from gradle dependencies (the arrow means the requested 3.0.7 is being resolved to 2.7.8):

+--- org.codehaus.janino:janino:3.0.7 -> 2.7.8
|    +--- org.codehaus.janino:commons-compiler:3.0.7

The Gradle section defining the dependencies:

dependencies {
  compile('org.apache.hadoop:hadoop-mapreduce-client-core:2.7.2') { force = true }
  compile('org.apache.hadoop:hadoop-common:2.7.2') { force = true }
  // https://mvnrepository.com/artifact/org.codehaus.janino/janino
  compile (group: 'org.codehaus.janino', name: 'janino', version: '3.0.7') {
    force = true
    exclude group: 'org.codehaus.janino', module: 'commons-compiler'
  }
  // https://mvnrepository.com/artifact/org.codehaus.janino/commons-compiler
  compile (group: 'org.codehaus.janino', name: 'commons-compiler', version: '3.0.7') {
    force = true
    exclude group: 'org.codehaus.janino', module: 'janino'
  }
  // https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.11
  compile (group: 'org.apache.spark', name: 'spark-sql_2.11', version: '2.2.0') {
    exclude group: 'org.codehaus.janino', module: 'janino'
    exclude group: 'org.codehaus.janino', module: 'commons-compiler'
  }
  // https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11
  compile (group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.2.0') {
    exclude group: 'org.codehaus.janino', module: 'janino'
    exclude group: 'org.codehaus.janino', module: 'commons-compiler'
  }
  // https://mvnrepository.com/artifact/org.apache.spark/spark-mllib_2.11
  compile (group: 'org.apache.spark', name: 'spark-mllib_2.11', version: '2.2.0') {
    exclude group: 'org.codehaus.janino', module: 'janino'
    exclude group: 'org.codehaus.janino', module: 'commons-compiler'
  }
  // https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-databind
  runtime group: 'com.fasterxml.jackson.core', name: 'jackson-databind', version: '2.6.5'
  // https://mvnrepository.com/artifact/com.fasterxml.jackson.module/jackson-module-scala_2.11
  runtime group: 'com.fasterxml.jackson.module', name: 'jackson-module-scala_2.11', version: '2.6.5'
  compile group: 'com.google.code.gson', name: 'gson', version: '2.8.1'
  compile group: 'org.apache.logging.log4j', name: 'log4j-api', version: '2.4.1'
  compile group: 'org.apache.logging.log4j', name: 'log4j-core', version: '2.4.1'
  testCompile 'org.testng:testng:6.9.4'
  testCompile 'org.mockito:mockito-core:1.10.19'
}
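
For what it's worth, a global resolution strategy would be another way to force the version instead of per-dependency force = true flags. This is a hedged, untested sketch against the same Gradle era as the build file above; running gradle dependencyInsight --dependency janino --configuration compile should also show which path actually pulls in 2.7.8, and --refresh-dependencies re-checks cached modules:

configurations.all {
  resolutionStrategy {
    // Pin both Janino artifacts so every configuration resolves the same version
    force 'org.codehaus.janino:janino:3.0.7',
          'org.codehaus.janino:commons-compiler:3.0.7'
  }
}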

The exception (the missing Location constructor indicates that the janino and commons-compiler jars on the classpath come from mismatched versions):

Exception in thread "main" java.lang.NoSuchMethodError: org.codehaus.commons.compiler.Location.<init>(Ljava/lang/String;SS)V
    at org.codehaus.janino.Scanner.location(Scanner.java:261)
    at org.codehaus.janino.Parser.location(Parser.java:2742)
    at org.codehaus.janino.Parser.parseImportDeclarationBody(Parser.java:209)
    at org.codehaus.janino.ClassBodyEvaluator.makeCompilationUnit(ClassBodyEvaluator.java:255)
    at org.codehaus.janino.ClassBodyEvaluator.cook(ClassBodyEvaluator.java:222)
    at org.codehaus.janino.SimpleCompiler.cook(SimpleCompiler.java:192)
    at org.codehaus.commons.compiler.Cookable.cook(Cookable.java:80)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.org$apache$spark$sql$catalyst$expressions$codegen$CodeGenerator$$doCompile(CodeGenerator.scala:960)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:1027)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$$anon$1.load(CodeGenerator.scala:1024)
    at org.spark_project.guava.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3599)
    at org.spark_project.guava.cache.LocalCache$Segment.loadSync(LocalCache.java:2379)
    at org.spark_project.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2342)
    at org.spark_project.guava.cache.LocalCache$Segment.get(LocalCache.java:2257)
    at org.spark_project.guava.cache.LocalCache.get(LocalCache.java:4000)
    at org.spark_project.guava.cache.LocalCache.getOrLoad(LocalCache.java:4004)
    at org.spark_project.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4874)
    at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator$.compile(CodeGenerator.scala:906)
    at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:375)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:138)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:135)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:116)
    at org.apache.spark.sql.execution.DeserializeToObjectExec.doExecute(objects.scala:95)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:138)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:135)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:116)
    at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:92)
    at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:92)
    at org.apache.spark.sql.Dataset.rdd$lzycompute(Dataset.scala:2581)
    at org.apache.spark.sql.Dataset.rdd(Dataset.scala:2578)
    at org.apache.spark.ml.feature.StringIndexer.fit(StringIndexer.scala:111)


Answer

Maybe you have solved the problem already; I ran into the same error today. However, I don't understand why you added these exclusions, and they don't seem right to me:

  // https://mvnrepository.com/artifact/org.codehaus.janino/janino
  compile (group: 'org.codehaus.janino', name: 'janino', version: '3.0.7') {
    force = true
    exclude group: 'org.codehaus.janino', module: 'commons-compiler'
  }
  // https://mvnrepository.com/artifact/org.codehaus.janino/commons-compiler
  compile (group: 'org.codehaus.janino', name: 'commons-compiler', version: '3.0.7') {
    force = true
    exclude group: 'org.codehaus.janino', module: 'janino'
  }

We just needed to exclude org.codehaus.janino:commons-compiler from org.apache.spark:spark-mllib_2.11 (the other Spark dependencies are already present as transitive dependencies of mllib, so there is no need to add them or to exclude commons-compiler from each of them individually) and then add org.codehaus.janino:commons-compiler:3.0.7 back.

Here's the dependency block from a working project. My project was built with Maven, but anyone should be able to convert this to its Gradle equivalent (a hedged Gradle sketch follows the Maven block below).

<!--Spark Libraries-->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
    <!--Dropping Logger Dependencies-->
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
        </exclusion>
    </exclusions>
</dependency>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.11</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
    <!--Dropping commons-compiler-->
    <exclusions>
        <exclusion>
            <groupId>org.codehaus.janino</groupId>
            <artifactId>commons-compiler</artifactId>
        </exclusion>
    </exclusions>
</dependency>

<dependency>
    <groupId>org.codehaus.janino</groupId>
    <artifactId>commons-compiler</artifactId>
    <version>3.0.8</version>
</dependency>
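
For completeness, a hedged Gradle sketch of the same fix (untested; it mirrors the Maven exclusions above and keeps the question's compile configuration, since Maven's provided scope has no direct equivalent there, though compileOnly in Gradle 2.12+ comes close):

dependencies {
  // spark-core with the same logger exclusions as the Maven block
  compile (group: 'org.apache.spark', name: 'spark-core_2.11', version: '2.2.0') {
    exclude group: 'org.slf4j', module: 'slf4j-log4j12'
    exclude group: 'log4j', module: 'log4j'
  }
  // spark-mllib minus its transitive commons-compiler
  compile (group: 'org.apache.spark', name: 'spark-mllib_2.11', version: '2.2.0') {
    exclude group: 'org.codehaus.janino', module: 'commons-compiler'
  }
  // ...then add the matching commons-compiler back explicitly
  compile group: 'org.codehaus.janino', name: 'commons-compiler', version: '3.0.8'
}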

Note: commons-compiler 2.7.8 was also working fine for me when used with a Spring Boot release version and Elasticsearch 2.4. We only had to upgrade to 3.0.8 after we moved to the Spring Boot milestone version 2.0.0.M7 and Elasticsearch 5.6.
