Why does the Scala compiler fail with "object SparkConf in package spark cannot be accessed in package org.apache.spark"?

This article explains how to deal with the error "object SparkConf in package spark cannot be accessed in package org.apache.spark" reported by the Scala compiler; the question and its solution below should be a useful reference for anyone hitting the same problem.

Problem Description

I cannot access SparkConf in the package, even though I have already imported org.apache.spark.SparkConf. My code is:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD

import org.apache.spark._
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._

object SparkStreaming {
    def main(arg: Array[String]) = {

        val conf = new SparkConf.setMaster("local[2]").setAppName("NetworkWordCount")
        val ssc = new StreamingContext( conf, Seconds(1) )

        val lines = ssc.socketTextStream("localhost", 9999)
        val words = lines.flatMap(_.split(" "))
        val pairs_new = words.map( w => (w, 1) )
        val wordsCount = pairs_new.reduceByKey(_ + _)
        wordsCount.print() 

        ssc.start() // Start the computation
        ssc.awaitTermination() // Wait for the computation to terminate

    }
}

The sbt dependencies are:

name := "Spark Streaming"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "1.5.2" % "provided",
    "org.apache.spark" %% "spark-mllib" % "1.5.2",
    "org.apache.spark" %% "spark-streaming" % "1.5.2"
)

But the error shows that SparkConf cannot be accessed.

[error] /home/cliu/Documents/github/Spark-Streaming/src/main/scala/Spark-Streaming.scala:31: object SparkConf in package spark cannot be accessed in package org.apache.spark
[error]         val conf = new SparkConf.setMaster("local[2]").setAppName("NetworkWordCount")
[error]                        ^

Solution

It compiles if you add parentheses after SparkConf:

val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")

The point is that SparkConf is a class, not a function, and a class name can also be used as a prefix for scoping, i.e. to refer to members nested inside it. Without parentheses, new SparkConf.setMaster(...) is therefore read as an attempt to instantiate a type reached through the prefix SparkConf rather than as a constructor call, which is what the compiler is complaining about. Adding parentheses after the class name makes it unambiguous that you call the constructor first and then chain setMaster on the new instance. Here is an example from the Scala shell illustrating the difference:

scala> class C1 { var age = 0; def setAge(a:Int) = {age = a}}
defined class C1

scala> new C1
res18: C1 = $iwC$$iwC$C1@2d33c200

scala> new C1()
res19: C1 = $iwC$$iwC$C1@30822879

scala> new C1.setAge(30)  // this doesn't work

<console>:23: error: not found: value C1
          new C1.setAge(30)
              ^

scala> new C1().setAge(30) // this works

scala> 
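For completeness, here is a minimal sketch of the question's program with that fix applied. It uses the same imports and Spark 1.5.2 streaming API already shown in the question; the only substantive change is the pair of parentheses after SparkConf.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object SparkStreaming {
    def main(args: Array[String]): Unit = {

        // Parentheses call the SparkConf constructor; the setters are chained on the new instance.
        val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")
        val ssc = new StreamingContext(conf, Seconds(1))

        val lines = ssc.socketTextStream("localhost", 9999)
        val words = lines.flatMap(_.split(" "))
        val pairs = words.map(w => (w, 1))
        val wordCounts = pairs.reduceByKey(_ + _)
        wordCounts.print()

        ssc.start()            // Start the computation
        ssc.awaitTermination() // Wait for the computation to terminate
    }
}

To try it out, something must be sending lines of text on port 9999, for example nc -lk 9999 in another terminal, since ssc.socketTextStream("localhost", 9999) reads its input from that socket.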
