SPARK: failure: ``union'' expected but `(' found

This article describes how to handle the Spark error "failure: ``union'' expected but `(' found"; it should be a useful reference for anyone facing the same problem.

Problem description


I have a dataframe called df with a column named employee_id. I am doing:

 df.registerTempTable("d_f")
val query = """SELECT *, ROW_NUMBER() OVER (ORDER BY employee_id) row_number FROM d_f"""
val result = Spark.getSqlContext().sql(query)

But I am getting the following issue. Any help?

[1.29] failure: ``union'' expected but `(' found
SELECT *, ROW_NUMBER() OVER (ORDER BY employee_id) row_number FROM d_f
                            ^
java.lang.RuntimeException: [1.29] failure: ``union'' expected but `(' found
SELECT *, ROW_NUMBER() OVER (ORDER BY employee_id) row_number FROM d_f

Solution

Spark 2.0+

Spark 2.0 introduced a native implementation of window functions (SPARK-8641), so a HiveContext should no longer be required. Nevertheless, similar errors that are not related to window functions can still be attributed to differences between the SQL parsers.
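
For illustration, here is a minimal sketch of the same query on Spark 2.0+, where the plain SparkSession parser accepts the OVER clause; the sample data is an assumption added for the example:

 import org.apache.spark.sql.SparkSession

 val spark = SparkSession.builder().appName("window-example").getOrCreate()

 // Hypothetical sample data with an employee_id column, as in the question.
 val df = spark.range(1, 6).toDF("employee_id")

 // createOrReplaceTempView replaces the deprecated registerTempTable in 2.0+.
 df.createOrReplaceTempView("d_f")

 // The window-function query now parses without a HiveContext.
 val result = spark.sql(
   "SELECT *, ROW_NUMBER() OVER (ORDER BY employee_id) row_number FROM d_f")
 result.show()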

Spark <= 1.6

Window functions were introduced in Spark 1.4.0 and require a HiveContext to work. A plain SQLContext won't work here.

Make sure you use Spark >= 1.4.0 and create a HiveContext:

import org.apache.spark.sql.hive.HiveContext
val sqlContext = new HiveContext(sc)
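
With that HiveContext in place, the original query from the question parses and runs. A minimal usage sketch, where sc is the existing SparkContext and the sample data is a hypothetical stand-in for the asker's dataframe:

 import sqlContext.implicits._  // enables toDF on local collections

 // Hypothetical sample data with the employee_id column from the question.
 val df = sc.parallelize(Seq(1, 2, 3, 4, 5)).toDF("employee_id")
 df.registerTempTable("d_f")

 // The window-function query that the plain SQLContext rejected now runs.
 val result = sqlContext.sql(
   "SELECT *, ROW_NUMBER() OVER (ORDER BY employee_id) row_number FROM d_f")
 result.show()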

That concludes this article on SPARK: failure: ``union'' expected but `(' found. We hope the answer above helps, and thank you for supporting IT屋!
