Trying out the Cloudera Spark tutorial won't work: "ClassNotFoundException"


Problem description

I tried the solutions suggested in similar existing posts, but none of them worked for me :-( I'm getting really hopeless, so I decided to post this as a new question.

I tried a tutorial (link below) on building a first Scala or Java application with Spark in a Cloudera VM.

This is my spark-submit command and its output:

[cloudera@quickstart sparkwordcount]$ spark-submit --class com.cloudera.sparkwordcount.SparkWordCount --master local  /home/cloudera/src/main/scala/com/cloudera/sparkwordcount/target/sparkwordcount-0.0.1-SNAPSHOT.jar
java.lang.ClassNotFoundException: com.cloudera.sparkwordcount.SparkWordCount
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:270)
    at org.apache.spark.util.Utils$.classForName(Utils.scala:176)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:689)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

I also tried updating the pom.xml file with my actual CDH, Spark and Scala versions, but it still doesn't work.
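
For reference, these are the commands I would expect to print the component versions on the quickstart VM (assuming spark-submit, hadoop and scala are all on the PATH), so the values in pom.xml can be matched against them:

# Spark version shipped with the VM
spark-submit --version

# Hadoop/CDH version string
hadoop version

# Scala version, if the scala launcher is installed
scala -version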

When I extract the jar file previously generated by Maven with mvn package, I cannot find any .class file anywhere inside its hierarchy of folders.
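
One way to double-check this from the shell (a sketch, assuming the jar path from the spark-submit command above) is to list the jar entries and keep only compiled classes; an empty result would confirm that no .class files made it into the jar:

# List every entry in the jar and filter for compiled class files
jar tf /home/cloudera/src/main/scala/com/cloudera/sparkwordcount/target/sparkwordcount-0.0.1-SNAPSHOT.jar | grep '\.class$'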

Sorry, I am a bit new to Cloudera and Spark. I basically tried to follow this tutorial with Scala: https://blog.cloudera.com/blog/2014/04/how-to-run-a-simple-apache-spark-app-in-cdh-5/

I checked the class, folder and Scala file names very closely quite a few times, especially for lower/uppercase issues, and nothing seemed wrong.

I opened my jar and there is some file hierarchy, and in the deepest folder I find the pom.xml file again, but I cannot see any .class files anywhere inside the jar. Does that mean the compilation via "mvn package" didn't actually work, even though the console output said the build was successful?

Recommended answer

I was having the same issue. Try rerunning after changing the class name from

--class com.cloudera.sparkwordcount.SparkWordCount

to

--class SparkWordCount

The full command I used looked like this:

spark-submit --class SparkWordCount --master local --deploy-mode client --executor-memory 1g --name wordcount --conf "spark.app.id=wordcount" target/sparkwordcount-0.0.1-SNAPSHOT.jar /user/cloudera/inputfile.txt 2
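
For completeness, /user/cloudera/inputfile.txt in the command above is an HDFS path, so the input file has to be uploaded to HDFS first (a sketch, assuming the file exists locally as inputfile.txt):

# Copy the local input file into the cloudera user's HDFS home directory
hdfs dfs -put inputfile.txt /user/cloudera/inputfile.txt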
