sbt - object apache is not a member of package org
Problem description
I want to deploy and submit a Spark program using sbt, but it throws an error.
Code:
package in.goai.spark

import org.apache.spark.{SparkContext, SparkConf}

object SparkMeApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("First Spark")
    val sc = new SparkContext(conf)
    val fileName = args(0)
    val lines = sc.textFile(fileName).cache
    val c = lines.count
    println(s"There are $c lines in $fileName")
  }
}
build.sbt
name := "First Spark"
version := "1.0"
organization := "in.goai"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
resolvers += Resolver.mavenLocal
In the first/project directory:
build.properties
sbt.version=0.13.9
When I try to run sbt package, it throws the error shown below.
[root@hadoop first]# sbt package
[info] Loading project definition from /home/training/workspace_spark/first/project
[info] Set current project to First Spark (in build file:/home/training/workspace_spark/first/)
[info] Compiling 1 Scala source to /home/training/workspace_spark/first/target/scala-2.11/classes...
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:3: object apache is not a member of package org
[error] import org.apache.spark.{SparkContext, SparkConf}
[error] ^
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:9: not found: type SparkConf
[error] val conf = new SparkConf().setAppName("First Spark")
[error] ^
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:11: not found: type SparkContext
[error] val sc = new SparkContext(conf)
[error] ^
[error] three errors found
[error] (compile:compile) Compilation failed
[error] Total time: 4 s, completed May 10, 2018 4:05:10 PM
I have also tried with extends App, but nothing changed.
Recommended answer
Please remove resolvers += Resolver.mavenLocal from build.sbt. Since spark-core is available on Maven, we don't need to use local resolvers.
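For reference, a sketch of what build.sbt would look like after that fix; it is identical to the original except that the resolver line is gone:

// build.sbt with the mavenLocal resolver removed (otherwise unchanged)
name := "First Spark"

version := "1.0"

organization := "in.goai"

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"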
After that, you can try sbt clean package.
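Once the package step succeeds, the jar can be submitted to Spark. A minimal sketch, assuming sbt's default artifact path and a local master; the jar file name below is derived from the name, scalaVersion, and version settings above and may differ on your machine, and the input path is a placeholder:

spark-submit \
  --class in.goai.spark.SparkMeApp \
  --master "local[*]" \
  target/scala-2.11/first-spark_2.11-1.0.jar \
  /path/to/input.txt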