Can't access Spark 2.0 Temporary Table from Beeline


Question

With Spark 1.5.1, I was able to access spark-shell temporary tables from Beeline via the Thrift Server, by following answers to related questions on Stack Overflow.

However, after upgrading to Spark 2.0, I can no longer see temporary tables from Beeline. Here are the steps I'm following.

I'm launching spark-shell with the following command:

./bin/spark-shell --master=spark://myHost.local:7077 --conf spark.sql.hive.thriftServer.singleSession=true

Once the spark shell is ready, I enter the following lines to start the Thrift server and create a temporary view from a DataFrame whose source is a JSON file:

import org.apache.spark.sql.hive.thriftserver._

// Start the Thrift server on port 10002, backed by this shell's SQL context
spark.sqlContext.setConf("hive.server2.thrift.port", "10002")
HiveThriftServer2.startWithContext(spark.sqlContext)

// Register a session-scoped temporary view from the sample JSON file
val df = spark.read.json("examples/src/main/resources/people.json")
df.createOrReplaceTempView("people")
spark.sql("select * from people").show()

The last statement displays the table; it runs fine.

However, when I start Beeline and connect to my Thrift server instance, I can't see any temporary tables:

show tables;
+------------+--------------+--+
| tableName  | isTemporary  |
+------------+--------------+--+
+------------+--------------+--+
No rows selected (0,658 seconds)
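For reference, the Beeline session above is opened against the embedded Thrift server with a connection command along these lines (the hostname and username are assumptions; port 10002 matches the `hive.server2.thrift.port` set in the shell):

```shell
# Connect Beeline to the Thrift server started from spark-shell.
# localhost and the username are assumptions; adjust for your setup.
./bin/beeline -u jdbc:hive2://localhost:10002 -n myUser
```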

Did I miss something in the upgrade from Spark 1.5.1 to 2.0? How can I regain access to my temporary tables?

Answer

This worked for me after upgrading to Spark 2.0.1:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

val sparkConf = new SparkConf()
  .setAppName("Spark Thrift Server Demo")
  .setMaster(sparkMaster)
  .set("hive.metastore.warehouse.dir", hdfsDataUri + "/hive")

val spark = SparkSession
  .builder()
  .enableHiveSupport()
  .config(sparkConf)
  .getOrCreate()

// Start the Thrift server against a SQLContext built from this
// application's SparkContext, so JDBC clients share its session state
val sqlContext = new org.apache.spark.sql.SQLContext(spark.sparkContext)
HiveThriftServer2.startWithContext(sqlContext)
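As background: in Spark 2.x, temporary views are session-scoped, which is why a Thrift server running in a different session cannot see them unless `spark.sql.hive.thriftServer.singleSession=true` is in effect. An alternative worth noting (a sketch, not part of the original answer, and requiring Spark 2.1+) is to register a global temporary view, which every session can query under the reserved `global_temp` database:

```scala
// Sketch, assuming the spark-shell setup from the question (Spark 2.1+).
// A global temp view is visible across sessions, including from Beeline,
// under the reserved global_temp database.
val df = spark.read.json("examples/src/main/resources/people.json")
df.createGlobalTempView("people_global")
// From Beeline: SELECT * FROM global_temp.people_global;
```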
