NoClassDefFoundError while using scopt OptionParser with Spark


Problem description

I am using Apache Spark version 1.2.1 and Scala version 2.10.4. I am trying to get the example MovieLensALS working. However, I am running into errors with the scopt library, which is a requirement in the code. Any help would be appreciated. My build.sbt is as follows:

name := "Movie Recommender System"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1"

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.2.1"

libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.2.1"

libraryDependencies += "com.github.scopt" %% "scopt" % "3.2.0"

resolvers += Resolver.sonatypeRepo("public")

The error I get is as follows:

   Exception in thread "main" java.lang.NoClassDefFoundError: scopt/OptionParser
    at MovieLensALS.main(MovieLensALS.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

    Caused by: java.lang.ClassNotFoundException: scopt.OptionParser
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
    ... 8 more
    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

On running sbt assembly to build the jar, I receive the following errors:

[error] Not a valid command: assembly
[error] Not a valid project ID: assembly
[error] Expected ':' (if selecting a configuration)
[error] Not a valid key: assembly
[error] assembly
[error]         ^

As per Justin Piphony's suggestion, the solution listed on sbt-assembly's GitHub page helped fix this error: basically, create a file assembly.sbt in the project/ directory and add the line addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0").

Note that the plugin version should be chosen according to the sbt version in use.
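
For reference, a minimal sketch of that setup. The project/assembly.sbt file needs only the plugin line quoted above; the merge strategy shown for build.sbt is an addition not in the original suggestion, but it is a common companion step because Spark's transitive dependencies often collide on META-INF files during assembly (the assemblyMergeStrategy key is available in sbt-assembly 0.13.x):

// project/assembly.sbt -- registers the sbt-assembly plugin
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")

// build.sbt -- optional: resolve duplicate files pulled in by Spark's dependencies
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}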

Answer

You need to package scopt in your jar; sbt doesn't do this by default. To create this fat jar, you need to use sbt-assembly.
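
A sketch of the dependency setup under these assumptions (Spark 1.2.1, Scala 2.10, sbt-assembly as above). Marking the Spark artifacts as "provided" is a common refinement, not stated in the answer itself, since spark-submit already supplies Spark on the classpath at runtime; scopt has no such provider, so it stays in the default compile scope and gets bundled into the fat jar:

// build.sbt -- Spark comes from spark-submit at runtime, so it need not be bundled
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.2.1" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.2.1" % "provided"

// scopt must end up inside the jar, so it keeps the default scope
libraryDependencies += "com.github.scopt" %% "scopt" % "3.2.0"

After running sbt assembly, pass the assembled jar from target/scala-2.10/ to spark-submit instead of the plain package jar; the assembled jar contains scopt/OptionParser, so the NoClassDefFoundError no longer occurs.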
