Can't access Spark 2.0 Temporary Table from beeline


Problem Description

With Spark 1.5.1, I was able to access spark-shell temporary tables from Beeline through the Thrift Server, following answers to related questions on Stack Overflow.

However, after upgrading to Spark 2.0, I can no longer see temporary tables from Beeline. Here are the steps I'm following.

I'm launching spark-shell using the following command:

./bin/spark-shell --master=myHost.local:7077 --conf spark.sql.hive.thriftServer.singleSession=true
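
A quick sanity check, in case it helps: from inside spark-shell you can confirm the flag was picked up (spark.conf.get is standard Spark 2.x API; the key is the one passed on the command line above):

// Should return "true" if the --conf flag above was applied
spark.conf.get("spark.sql.hive.thriftServer.singleSession")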

Once the spark shell is ready, I enter the following lines to launch the Thrift Server and create a temporary view from a DataFrame whose source is a JSON file:

import org.apache.spark.sql.hive.thriftserver._

// Start the Thrift Server on port 10002 from within this spark-shell session
spark.sqlContext.setConf("hive.server2.thrift.port", "10002")
HiveThriftServer2.startWithContext(spark.sqlContext)

// Create a temporary view from a JSON-backed DataFrame and query it
val df = spark.read.json("examples/src/main/resources/people.json")
df.createOrReplaceTempView("people")
spark.sql("select * from people").show()

The last statement displays the table and runs fine.

However, when I start beeline and connect to my Thrift Server instance, I can't see any temporary tables:
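
For reference, I connect with a command along these lines (the exact host is an assumption; the port matches the one configured above):

./bin/beeline -u jdbc:hive2://localhost:10002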

show tables;
+------------+--------------+--+
| tableName  | isTemporary  |
+------------+--------------+--+
+------------+--------------+--+
No rows selected (0,658 seconds)

Did I miss something regarding my Spark upgrade from 1.5.1 to 2.0? How can I regain access to my temporary tables?

Recommended Answer

This worked for me after upgrading to Spark 2.0.1:

import org.apache.spark.SparkConf
import org.apache.spark.sql.{SparkSession, SQLContext}
import org.apache.spark.sql.hive.thriftserver.HiveThriftServer2

// Build a Hive-enabled SparkSession pointing at the warehouse directory
val sparkConf = new SparkConf()
  .setAppName("Spark Thrift Server Demo")
  .setMaster(sparkMaster)
  .set("hive.metastore.warehouse.dir", hdfsDataUri + "/hive")

val spark = SparkSession
  .builder()
  .enableHiveSupport()
  .config(sparkConf)
  .getOrCreate()

// Start the Thrift Server against a SQLContext derived from this session
val sqlContext = new SQLContext(spark.sparkContext)
HiveThriftServer2.startWithContext(sqlContext)
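
As a minimal follow-up sketch (my assumption, not part of the original answer): once the server is up, registering a view through that same sqlContext should make it visible to Beeline sessions, mirroring the question's example:

// Hypothetical usage: register a temp view through the shared sqlContext
val df = sqlContext.read.json("examples/src/main/resources/people.json")
df.createOrReplaceTempView("people")
// From beeline, SHOW TABLES; should now list "people" with isTemporary = true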
