spark-submit java.lang.ClassNotFoundException
Problem description
I'm trying to run my own Spark application, but when I use the spark-submit command I get this error:
Users/_name_here/dev/sp/target/scala-2.10/sp_2.10-0.1-SNAPSHOT.jar --stacktrace
java.lang.ClassNotFoundException: /Users/_name_here/dev/sp/mo/src/main/scala/MySimpleApp
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:340)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:633)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I'm using the following command:
/Users/_name_here/dev/spark/bin/spark-submit
--class "/Users/_name_here/dev/sp/mo/src/main/scala/MySimpleApp"
--master local[4] /Users/_name_here/dev/sp/target/scala-2.10/sp_2.10-0.1-SNAPSHOT.jar
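(As the stack trace above suggests, `--class` is being given a file-system path, which spark-submit then tries to load as a class name. `--class` expects the fully qualified name of the main class, not the path to its source file. A hypothetical corrected invocation, assuming `MySimpleApp` is declared without a `package` statement so its bare name is also its fully qualified name, would look like this — untested sketch, paths kept from the question:)

```shell
# --class takes the fully qualified class name (no path, no .scala suffix).
# If MySimpleApp were declared inside e.g. `package com.example`, the value
# would instead be com.example.MySimpleApp.
/Users/_name_here/dev/spark/bin/spark-submit \
  --class MySimpleApp \
  --master "local[4]" \
  /Users/_name_here/dev/sp/target/scala-2.10/sp_2.10-0.1-SNAPSHOT.jar
```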
My build.sbt looks like this:
name := "mo"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.10" % "1.4.0",
"org.postgresql" % "postgresql" % "9.4-1201-jdbc41",
"org.apache.spark" % "spark-sql_2.10" % "1.4.0",
"org.apache.spark" % "spark-mllib_2.10" % "1.4.0",
"org.tachyonproject" % "tachyon-client" % "0.6.4",
"org.postgresql" % "postgresql" % "9.4-1201-jdbc41",
"org.apache.spark" % "spark-hive_2.10" % "1.4.0",
"com.typesafe" % "config" % "1.2.1"
)
resolvers += "Typesafe Repo" at "http://repo.typesafe.com/typesafe/releases/"
My plugin.sbt:
logLevel := Level.Warn
resolvers += "Sonatype snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/"
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" %"0.11.2")
I'm using the prebuilt package from spark.apache.org. I installed sbt through brew, as well as Scala. Running `sbt package` from the project root works fine and creates the jar, but `sbt assembly` doesn't work at all, maybe because the assembly plugin is missing in the prebuilt Spark folder. I would appreciate any help, because I'm quite new to Spark. Oh, and by the way, the application runs fine within IntelliJ.
Answer
Apparently there was something wrong with my project structure in general. I created a new project with sbt and Sublime, and I'm now able to use spark-submit. This is really weird, though, because I haven't changed anything from the default structure of an sbt project as provided in IntelliJ. This is the project structure that now works like a charm:
Macbook:sp user$ find .
.
./build.sbt
./project
./project/plugin.sbt
./src
./src/main
./src/main/scala
./src/main/scala/MySimpleApp.scala
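(For reference, a minimal sketch of what `src/main/scala/MySimpleApp.scala` could contain — the job itself is a placeholder, not the original code. With no `package` declared, the value passed to `--class` must be exactly `MySimpleApp`. Assumes the spark-core 1.4.0 dependency from build.sbt above:)

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Declared without a package, so its fully qualified name is just MySimpleApp.
object MySimpleApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("MySimpleApp")
    val sc = new SparkContext(conf)

    // Placeholder job: count the elements of a small in-memory dataset.
    val count = sc.parallelize(1 to 100).count()
    println(s"count = $count")

    sc.stop()
  }
}
```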
Thanks for the help!