Why does the Scala compiler fail with "object SparkConf in package spark cannot be accessed in package org.apache.spark"?


Problem description

I cannot access SparkConf in the package, even though I have already imported org.apache.spark.SparkConf. My code is:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD

import org.apache.spark._
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._

object SparkStreaming {
    def main(arg: Array[String]) = {

        val conf = new SparkConf.setMaster("local[2]").setAppName("NetworkWordCount")
        val ssc = new StreamingContext( conf, Seconds(1) )

        val lines = ssc.socketTextStream("localhost", 9999)
        val words = lines.flatMap(_.split(" "))
        val pairs_new = words.map( w => (w, 1) )
        val wordsCount = pairs_new.reduceByKey(_ + _)
        wordsCount.print() 

        ssc.start() // Start the computation
        ssc.awaitTermination() // Wait for the computation to terminate

    }
}

The sbt dependencies are:

name := "Spark Streaming"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.5.2" % "provided",
    "org.apache.spark" %% "spark-mllib" % "1.5.2",
    "org.apache.spark" %% "spark-streaming" % "1.5.2"
)

But the error shows that SparkConf cannot be accessed:

[error] /home/cliu/Documents/github/Spark-Streaming/src/main/scala/Spark-Streaming.scala:31: object SparkConf in package spark cannot be accessed in package org.apache.spark
[error]         val conf = new SparkConf.setMaster("local[2]").setAppName("NetworkWordCount")
[error]                        ^

Answer

It compiles if you add parentheses after SparkConf:

val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")

The point is that SparkConf is a class and not a function, so the class name can also be used as a prefix for name lookup (scoping). When you add parentheses after the class name, you make sure you are calling the class constructor and not that scoping lookup. Here is an example from the Scala shell illustrating the difference:

scala> class C1 { var age = 0; def setAge(a:Int) = {age = a}}
defined class C1

scala> new C1
res18: C1 = $iwC$$iwC$C1@2d33c200

scala> new C1()
res19: C1 = $iwC$$iwC$C1@30822879

scala> new C1.setAge(30)  // this doesn't work

<console>:23: error: not found: value C1
          new C1.setAge(30)
              ^

scala> new C1().setAge(30) // this works

scala> 
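For what it's worth, the error message talks about an object rather than a class because Spark's SparkConf also has a companion object that is only accessible inside the spark package, and the parenthesis-less form resolves to that object (as the error text itself says). Below is a minimal sketch of the same situation with made-up names (demo, Conf and Main are illustrative, not Spark's actual source):

package demo {
  // A class whose setter returns the instance, like SparkConf.setMaster
  class Conf { def setMaster(m: String): Conf = this }
  // A companion object visible only inside the demo package,
  // analogous to Spark's package-private object SparkConf
  private[demo] object Conf
}

object Main extends App {
  // new demo.Conf.setMaster("local")   // does not compile: `demo.Conf` is read as
  //                                    // a path to the (inaccessible) object Conf
  val conf = new demo.Conf().setMaster("local") // works: parentheses call the constructor
  println(conf)
}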
