How to run Spark scala application inside Intellij


Question

I'm trying to run a simple Spark application using IntelliJ on the Hortonworks sandbox. I've opened a new SBT project, then created a Scala class:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "/root/temp.txt"
    val conf = new SparkConf().setAppName("Simple Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    println(logData.count())
  }
}

This is my build.sbt:

name := "Simple Project"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0" % "provided"
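
Note the "provided" scope on spark-core: under sbt, a provided dependency is available at compile time but excluded from the runtime classpath, which is why an IDE launch can fail even though the library shows up in the dependency list. A minimal sketch of the same build.sbt without that scope (an assumption for running locally from the IDE, not for cluster builds):

name := "Simple Project"
version := "1.0"
scalaVersion := "2.10.4"
// Without "provided", sbt/IntelliJ keep spark-core on the run classpath
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"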

Now right-clicking on this class -> Run throws an exception:

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkConf

Obviously I'm doing something wrong, but I can see the Spark libraries in the dependencies list. Any help? (BTW, running this program through the SBT Scala console works perfectly.)

Answer

Run:

object SimpleApp extends App {
  // With the App trait, the object body itself is the program entry point,
  // so no explicit main method is needed (defining one alongside App's
  // concrete main would not compile without an override).
  val logFile = "/root/temp.txt"
  val conf = new SparkConf().setAppName("Simple Application")
  val sc = new SparkContext(conf)
  val logData = sc.textFile(logFile, 2).cache()
  println(logData.count())
}
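
Additionally, when launching from IntelliJ rather than through spark-submit, no master URL is passed in, so SparkContext creation can also fail with "A master URL must be set". A minimal sketch assuming a purely local run (the local[2] master is an illustrative choice, not from the original question):

import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp extends App {
  val logFile = "/root/temp.txt"
  // setMaster("local[2]") runs Spark in-process with two worker threads;
  // only do this for local IDE runs, not when submitting to a cluster
  val conf = new SparkConf()
    .setAppName("Simple Application")
    .setMaster("local[2]")
  val sc = new SparkContext(conf)
  val logData = sc.textFile(logFile, 2).cache()
  println(logData.count())
}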
