SQLContext.gerorCreate is not a value


Problem description

I am getting the error "SQLContext.gerorCreate is not a value of object org.apache.spark.SQLContext". This is my code:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.StreamingContext
import org.apache.spark.streaming.Seconds
import org.apache.spark.streaming.kafka.KafkaUtils
import org.apache.spark.sql.functions
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.types
import org.apache.spark.SparkContext
import java.io.Serializable

case class Sensor(id: String, date: String, temp: String, press: String)

object consum {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setAppName("KafkaWordCount").setMaster("local[2]")
    val ssc = new StreamingContext(sparkConf, Seconds(2))
    val sc = new SparkContext(sparkConf)
    val lines = KafkaUtils.createStream(ssc, "localhost:2181", "spark-streaming-consumer-group", Map("hello" -> 5))

    def parseSensor(str: String): Sensor = {
      val p = str.split(",")
      Sensor(p(0), p(1), p(2), p(3))
    }

    val data = lines.map(_._2).map(parseSensor)
    val sqlcontext = new SQLContext(sc)

    import sqlcontext.implicits._

    data.foreachRDD { rdd =>
      val sensedata = sqlcontext.getOrCreate(rdd.sparkContext)
    }

I have tried with SQLContext.getOrCreate as well, but I get the same error.

Answer

No such getOrCreate function is defined for either SparkContext or SQLContext.

The getOrCreate function is defined on SparkSession, and it is what SparkSession instances are created from. The sparkContext and sqlContext instances are then obtained from the SparkSession instance returned by that getOrCreate call.

I hope the explanation is clear.
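As a minimal sketch of that flow, assuming Spark 2.x on the classpath (the object name and the sample rows below are placeholders, not from the question): getOrCreate is called on the SparkSession builder, and sparkContext / sqlContext are read off the resulting session.

import org.apache.spark.sql.SparkSession

object SparkSessionExample {
  def main(args: Array[String]): Unit = {
    // getOrCreate lives on the SparkSession builder, not on SQLContext
    val spark = SparkSession.builder()
      .appName("KafkaWordCount")
      .master("local[2]")
      .getOrCreate()

    val sc = spark.sparkContext       // SparkContext taken from the session
    val sqlContext = spark.sqlContext // legacy SQLContext, also taken from the session

    import spark.implicits._
    // Illustrative data only, shaped like the question's Sensor records
    val df = Seq(("s1", "2017-01-01", "21.5", "1013")).toDF("id", "date", "temp", "press")
    df.show()

    spark.stop()
  }
}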

Updated

The explanation above applies to more recent versions of Spark. In the blog post the OP is referencing, the author uses Spark 1.6, and the 1.6.3 API doc clearly states:

Get the singleton SQLContext if it exists or create a new one using the given SparkContext
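For Spark 1.6 this means the call in the question should go through the SQLContext companion object, not through the sqlcontext instance created with new SQLContext(sc). A minimal sketch under that assumption (the SensorSql object and the process helper are only illustrative; `sensors` stands in for the DStream called `data` in the question):

import org.apache.spark.sql.SQLContext
import org.apache.spark.streaming.dstream.DStream

object SensorSql {
  case class Sensor(id: String, date: String, temp: String, press: String)

  def process(sensors: DStream[Sensor]): Unit = {
    sensors.foreachRDD { rdd =>
      // getOrCreate is on the SQLContext companion object, so it is
      // SQLContext.getOrCreate(...), not sqlcontext.getOrCreate(...)
      val sqlContext = SQLContext.getOrCreate(rdd.sparkContext)
      import sqlContext.implicits._

      val sensorDF = rdd.toDF()            // works because Sensor is a case class
      sensorDF.registerTempTable("sensor") // Spark 1.6 API
      sensorDF.show()
    }
  }
}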
