SPARK - java.lang.ClassNotFoundException
Problem description
I'm trying to run a jar file using spark-submit:
han-ui-MacBook-Air:spark-2.2.0-bin-hadoop2.7 Alphahacker$ ./bin/spark-submit --class SimpleApp --master local[*] /Users/Alphahacker/IdeaProjects/sparksimpletest/target/spark-simple-test-1.0.jar
But I get the following error message:
java.lang.ClassNotFoundException: SimpleApp
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at org.apache.spark.util.Utils$.classForName(Utils.scala:230)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:712)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
My IntelliJ screen looks like this: (screenshot: program structure in IntelliJ). The pom.xml of this project is as follows:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>alpha.spark.test</groupId>
    <artifactId>spark-simple-test</artifactId>
    <version>1.0</version>

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.2.0</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>2.4.2</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <filters>
                                <filter>
                                    <artifact>spark-streaming-twitter_2.10:spark-streaming-twitter_2.10</artifact>
                                    <includes>
                                        <include>**</include>
                                    </includes>
                                </filter>
                                <filter>
                                    <artifact>twitter4j-stream:twitter4j-stream</artifact>
                                    <includes>
                                        <include>**</include>
                                    </includes>
                                </filter>
                                <filter>
                                    <artifact>*:*</artifact>
                                    <excludes>
                                        <exclude>META-INF/*.SF</exclude>
                                        <exclude>META-INF/*.DSA</exclude>
                                        <exclude>META-INF/*.RSA</exclude>
                                    </excludes>
                                </filter>
                            </filters>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
The program code is as follows:
import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]): Unit = {
    // Note: this reads the master URL from the first program argument, but the
    // spark-submit command above passes no application arguments, so this line
    // would throw ArrayIndexOutOfBoundsException once the class is found.
    val conf = new SparkConf().setAppName("SimpleApp").setMaster(args(0))
    val sc = new SparkContext(conf)
    val filePath = "/Users/Alphahacker/Desktop/README.md"
    val inputRDD = sc.textFile(filePath)
    val matchTerm = "spark"
    val numMatches = inputRDD.filter(_.contains(matchTerm)).count()
    println("%s lines in %s contain %s".format(numMatches, filePath, matchTerm))
    System.exit(0)
  }
}
I don't know what is wrong. As you can see in the screenshot above, I wrote the SimpleApp object without a package declaration and built the project with Maven, and I thought the command was correct.
Please help me.
Recommended answer
You have to save your source files in Maven's standard directory layout, e.g. src/main/java/example.java, and build with Maven. This fixed an error of mine that was very similar to yours.
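Beyond the directory layout, one detail of the posted pom.xml is worth checking (an observation about the question, not part of the original answer): SimpleApp is written in Scala, but the pom declares only the shade plugin and no Scala compiler plugin, so Maven would compile nothing from src/main/scala and produce a jar without SimpleApp.class, which matches the exception exactly. A hedged sketch of the usual fix, using the scala-maven-plugin (the version number is illustrative):

```xml
<!-- Added alongside the shade plugin under <build><plugins>; compiles
     sources under src/main/scala during the normal Maven lifecycle. -->
<plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <version>3.2.2</version>
    <executions>
        <execution>
            <goals>
                <goal>compile</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```

With the source at src/main/scala/SimpleApp.scala and this plugin in place, `mvn package` should produce a jar that contains SimpleApp.class.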