How to run a Spark example program in IntelliJ IDEA
Question
First, on the command line from the root of the downloaded Spark project, I ran
mvn package
This was successful.
Then an IntelliJ project was created by importing the Spark pom.xml.
In the IDE the example class appears fine: all of the libraries are found. This can be viewed in the screenshot.
However, when attempting to run main(), a ClassNotFoundException on SparkContext occurs.
Why can IntelliJ not simply load and run this Maven-based Scala program? And what can be done as a workaround?
As one can see below, the SparkContext looks fine in the IDE, but it is then not found when attempting to run:
The test was run by right-clicking inside main():
.. and selecting Run GroupByTest.
It gives
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkContext
at org.apache.spark.examples.GroupByTest$.main(GroupByTest.scala:36)
at org.apache.spark.examples.GroupByTest.main(GroupByTest.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkContext
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
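The trace shows the class was present at compile time (the IDE resolves it) but is missing from the runtime classpath when the run configuration launches. The distinction can be illustrated with a minimal, self-contained sketch (the class names checked here are just examples; no Spark is required to run it):

```java
public class MissingClassDemo {
    // Reports whether the current classloader can load the given class.
    static String check(String className) {
        try {
            Class.forName(className);
            return "found: " + className;
        } catch (ClassNotFoundException e) {
            return "not found: " + className;
        }
    }

    public static void main(String[] args) {
        // java.lang.String is always on the bootstrap classpath ...
        System.out.println(check("java.lang.String"));
        // ... but SparkContext resolves only if spark-core is on the
        // *runtime* classpath, regardless of what the IDE sees at edit time.
        System.out.println(check("org.apache.spark.SparkContext"));
    }
}
```

Running this without any Spark jar on the classpath reproduces the same failure mode the run configuration hits: compile-time resolution in the IDE does not guarantee the jar is passed to the launched JVM.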
Here is the run configuration:
Accepted answer
The Spark lib directory isn't on your classpath.
Execute
sbt/sbt assembly
and afterwards add "/assembly/target/scala-$SCALA_VERSION/spark-assembly*hadoop*-deps.jar" to your project.