spark-submit java.lang.ClassNotFoundException


Question

I'm trying to run my own Spark application, but when I use the spark-submit command I get this error:

Users/_name_here/dev/sp/target/scala-2.10/sp_2.10-0.1-SNAPSHOT.jar --stacktrace
java.lang.ClassNotFoundException:        /Users/_name_here/dev/sp/mo/src/main/scala/MySimpleApp
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:340)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:633)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I'm using the following command:

/Users/_name_here/dev/spark/bin/spark-submit 
--class "/Users/_name_here/dev/sp/mo/src/main/scala/MySimpleApp" 
--master local[4] /Users/_name_here/dev/sp/target/scala-2.10/sp_2.10-0.1-SNAPSHOT.jar 
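Note the --class argument above: spark-submit expects the fully-qualified name of the main class as it exists inside the jar, not a filesystem path to the source file, which is exactly what the ClassNotFoundException is complaining about. Assuming MySimpleApp is declared with no package statement, the invocation would look like this (a sketch, not tested against this exact project):

```shell
/Users/_name_here/dev/spark/bin/spark-submit \
  --class MySimpleApp \
  --master "local[4]" \
  /Users/_name_here/dev/sp/target/scala-2.10/sp_2.10-0.1-SNAPSHOT.jar
```

If the object lives in a package (say, package com.example), the argument becomes --class com.example.MySimpleApp. The names actually packaged in the jar can be checked with jar tf <jar-file>.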

My build.sbt looks like this:

name := "mo"

version := "1.0"

scalaVersion := "2.10.4"


libraryDependencies ++= Seq(
  "org.apache.spark"          % "spark-core_2.10"   %    "1.4.0",
  "org.postgresql"            % "postgresql"        %    "9.4-1201-jdbc41",
  "org.apache.spark"          % "spark-sql_2.10"    %    "1.4.0",
  "org.apache.spark"          % "spark-mllib_2.10"  %    "1.4.0",
  "org.tachyonproject"        % "tachyon-client"    %    "0.6.4",
  "org.postgresql"            % "postgresql"        %    "9.4-1201-jdbc41",
  "org.apache.spark"          % "spark-hive_2.10"   %    "1.4.0",
  "com.typesafe"              % "config"            %    "1.2.1"
)

resolvers += "Typesafe Repo" at "http://repo.typesafe.com/typesafe/releases/"

My plugin.sbt:

logLevel := Level.Warn

resolvers += "Sonatype snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/"

addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")
addSbtPlugin("com.eed3si9n" % "sbt-assembly"  %"0.11.2")

I'm using the prebuilt package from spark.apache.org. I installed sbt through brew, as well as Scala. Running sbt package from the Spark root folder works fine and creates the jar, but using assembly doesn't work at all, maybe because it's missing in the prebuilt Spark folder. I would appreciate any help, because I'm quite new to Spark. Oh, and by the way, Spark runs fine within IntelliJ.

Answer

Apparently there must have been something wrong with my project structure in general, because after I created a new project with sbt and Sublime, I'm now able to use spark-submit. This is really weird, though, because I haven't changed anything about the default structure of an sbt project as generated in IntelliJ. This is the project structure that now works like a charm:

Macbook:sp user$ find .
.
./build.sbt
./project
./project/plugin.sbt
./src
./src/main
./src/main/scala
./src/main/scala/MySimpleApp.scala
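The original MySimpleApp.scala isn't shown in the question, but for a layout like this to work with --class MySimpleApp, the file would need a main object at the top level (no package declaration). A minimal sketch under those assumptions, using the Spark 1.4-era API:

```scala
// Hypothetical sketch of src/main/scala/MySimpleApp.scala -- the actual
// source isn't shown in the question. With no `package` declaration, the
// fully-qualified name spark-submit needs is simply `MySimpleApp`.
import org.apache.spark.{SparkConf, SparkContext}

object MySimpleApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("MySimpleApp")
    val sc = new SparkContext(conf)
    // Trivial job: count the numbers 1 to 100 in parallel.
    val count = sc.parallelize(1 to 100).count()
    println(s"count = $count")
    sc.stop()
  }
}
```

If the object were instead declared inside, say, package com.example, the submit command would have to use --class com.example.MySimpleApp to match.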

Thanks for your help!
