idea sbt java.lang.NoClassDefFoundError: org/apache/spark/SparkConf


Problem Description

I'm a beginner with Spark. I built my environment with "linux + idea + sbt", and when I tried Spark's quick start I ran into this problem:

    Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkConf
    at test$.main(test.scala:11)
    at test.main(test.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkConf
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 7 more

The versions installed on my machine:

sbt   = 0.13.11
jdk   = 1.8
scala = 2.10
idea  = 2016

My directory structure:

test/
  idea/
  out/
  project/
    build.properties    
    plugins.sbt
  src/
    main/
      java/
      resources/
      scala/
      scala-2.10/
        test.scala
  target/
  assembly.sbt
  build.sbt

In build.properties:

sbt.version = 0.13.8

In plugins.sbt:

logLevel := Level.Warn

addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")

In build.sbt:

import sbt._
import Keys._
import sbtassembly.Plugin._
import AssemblyKeys._

name := "test"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.1" % "provided"

In assembly.sbt:

import AssemblyKeys._ // put this at the top of the file

assemblySettings

In test.scala:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object test {
  def main(args: Array[String]) {
    val logFile = "/opt/spark-1.6.1-bin-hadoop2.6/README.md" // Should be some file on your system
    val conf = new SparkConf().setAppName("Test Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}

How can I solve this problem?

Recommended Answer

Dependencies with "provided" scope are only available during compilation and testing, and are not available at runtime or for packaging. So, instead of making an object test with a main method, you should make it an actual test suite placed in src/test/scala. (If you're not familiar with unit testing in Scala, I'd suggest using ScalaTest, for example. First add a dependency on it in your build.sbt: libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.4" % Test, and then go through this quick start tutorial to implement a simple spec.)
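
If it helps, here is a minimal sketch of what such a suite could look like. The file name TestSpec.scala and the class name are hypothetical, and setMaster("local[2]") is an assumption added so the test runs Spark inside the test JVM:

// src/test/scala/TestSpec.scala -- hypothetical name, adjust to your project
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FlatSpec, Matchers}

class TestSpec extends FlatSpec with Matchers with BeforeAndAfterAll {

  private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    // "local[2]" runs Spark inside the test JVM, so the "provided"
    // spark-core jar on the test classpath is all that is needed
    val conf = new SparkConf()
      .setAppName("Test Application")
      .setMaster("local[2]")
    sc = new SparkContext(conf)
  }

  override def afterAll(): Unit = {
    if (sc != null) sc.stop()
  }

  "README.md" should "contain lines with 'a' and 'b'" in {
    val logData = sc.textFile("/opt/spark-1.6.1-bin-hadoop2.6/README.md", 2).cache()
    logData.filter(_.contains("a")).count() should be > 0L
    logData.filter(_.contains("b")).count() should be > 0L
  }
}

Running sbt test then executes the spec on the compile-and-test classpath, where the "provided" spark-core jar is visible.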

Another option, which is quite hacky in my opinion (but does the trick nonetheless), involves removing the provided scope from your spark-core dependency in some configurations; it is described in the accepted answer to this question.
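
For reference, the trick that answer describes boils down to re-adding the "provided" dependencies to the run task's classpath, so that sbt run can still see spark-core while the packaged jar does not. A rough sketch of that pattern for sbt 0.13, based on the linked answer rather than on the original question, would go in build.sbt:

// Let `run` use the compile classpath, which (unlike the runtime
// classpath) still contains "provided" dependencies such as spark-core
run in Compile := Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
).evaluated

With this in place, the fat jar built by sbt-assembly still excludes spark-core, but sbt run launches the main class with it on the classpath.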
