SQLContext.gerorCreate is not a value


Problem description


I am getting the error SQLContext.gerorCreate is not a value of object org.apache.spark.SQLContext. This is my code:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.streaming.Seconds
import org.apache.spark.streaming.kafka.KafkaUtils
import org.apache.spark.sql.functions
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.types
import org.apache.spark.SparkContext
import java.io.Serializable

case class Sensor(id: String, date: String, temp: String, press: String)

object consum {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setAppName("KafkaWordCount").setMaster("local[2]")
    val ssc = new StreamingContext(sparkConf, Seconds(2))
    val sc = new SparkContext(sparkConf)
    val lines = KafkaUtils.createStream(ssc, "localhost:2181", "spark-streaming-consumer-group", Map("hello" -> 5))

    def parseSensor(str: String): Sensor = {
      val p = str.split(",")
      Sensor(p(0), p(1), p(2), p(3))
    }

    val data = lines.map(_._2).map(parseSensor)
    val sqlcontext = new SQLContext(sc)

    import sqlcontext.implicits._
    data.foreachRDD { rdd =>
      // this is the line that fails to compile
      val sensedata = sqlcontext.getOrCreate(rdd.sparkContext)
    }
  }
}

I have also tried SQLContext.getOrCreate, but I get the same error.

Solution

No such getOrCreate function is defined for either SparkContext or SQLContext.

The getOrCreate function is defined on SparkSession's builder and is what creates SparkSession instances. The sparkContext instance and the sqlContext instance are then obtained from the SparkSession instance that the getOrCreate call returns.
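For illustration, a minimal Spark 2.x sketch, reusing the app name and master from the question:

import org.apache.spark.sql.SparkSession

// getOrCreate is defined on SparkSession's builder: it returns the existing
// SparkSession if there is one, or builds a new one from these settings.
val spark = SparkSession.builder()
  .appName("KafkaWordCount")
  .master("local[2]")
  .getOrCreate()

// The contexts come from the session, not the other way around.
val sc = spark.sparkContext
val sqlContext = spark.sqlContext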

I hope the explanation is clear.

Updated

The explanation above applies to newer versions of Spark. In the blog the OP is referencing, the author uses Spark 1.6, and the API doc of 1.6.3 clearly states:

Get the singleton SQLContext if it exists or create a new one using the given SparkContext
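So with Spark 1.6 the call belongs on the SQLContext companion object, not on an instance. Applied to the foreachRDD loop from the question, it would look roughly like this (a sketch; the toDF and show lines are illustrative, not from the original code):

import org.apache.spark.sql.SQLContext

data.foreachRDD { rdd =>
  // Spark 1.6: static call on the SQLContext object, passing the SparkContext.
  // Note the spelling: getOrCreate, not gerorCreate.
  val sqlContext = SQLContext.getOrCreate(rdd.sparkContext)
  import sqlContext.implicits._
  val df = rdd.toDF()  // illustrative: the Sensor case class supplies the schema
  df.show()            // illustrative
}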

