SparkSql不支持日期格式 [英] SparkSql not supporting Date Format


Problem Description


I tried to use a Date field with Spark SQL, but it is not working.

For example, as in the JavaSparkSQL example (https://github.com/apache/spark/blob/master/examples/src/main/java/org/apache/spark/examples/sql/JavaSparkSQL.java).

I tried to add a date column, dob. In the Person class, I added a setter and getter for dob as java.util.Date.

When I tried to execute:

SELECT dob, name, age, count(*) as totalCount FROM Person WHERE dob >= '1995-01-01' AND age <= '2014-02-01';

I also tried using BETWEEN in the query instead of <= and >=, with the same result:

/Volumes/Official/spark-1.0.2-bin-hadoop2$: bin/spark-submit --class "SimpleApp" --master local[4] try/simple-project/target/simple-project-1.0.jar 
Spark assembly has been built with Hive, including Datanucleus jars on classpath
2014-08-21 11:42:47.360 java[955:1903] Unable to load realm mapping info from SCDynamicStore
=== Data source: RDD ===
Exception in thread "main" scala.MatchError: class java.util.Date (of class java.lang.Class)

Solution

The feature is still pending; instead of Date, you can use Timestamp in the Person class. See SPARK-2552:

Spark SQL currently supports Timestamp, but not Date.

We will have to wait a while, until the 1.2.0 release.

Details:

  • Type: Improvement
  • Status: Open
  • Priority: Minor
  • Resolution: Unresolved
  • Affects Version/s: 1.0.1
  • Fix Version/s: None
  • Component/s: SQL
  • Target Version/s: 1.2.0
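A minimal sketch of the workaround described above: declare dob as java.sql.Timestamp rather than java.util.Date so Spark SQL 1.0.x can map it to a supported column type. The field and class names mirror the question; the main method, the sample values, and the date-parsing helper are illustrative assumptions (the Spark driver code itself is unchanged apart from the field type).

```java
import java.io.Serializable;
import java.sql.Timestamp;
import java.text.SimpleDateFormat;

// Person bean using java.sql.Timestamp for dob instead of java.util.Date,
// which is what triggers the scala.MatchError in Spark SQL 1.0.x.
public class Person implements Serializable {
    private String name;
    private int age;
    private Timestamp dob;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }

    public Timestamp getDob() { return dob; }
    public void setDob(Timestamp dob) { this.dob = dob; }

    public static void main(String[] args) throws Exception {
        // Convert a "yyyy-MM-dd" string into a Timestamp for the bean
        // (illustrative values, not from the original question).
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd");
        Person p = new Person();
        p.setName("John");
        p.setAge(30);
        p.setDob(new Timestamp(fmt.parse("1995-06-15").getTime()));
        System.out.println(p.getDob());
    }
}
```

With this change the bean registers as a table as in the JavaSparkSQL example, and queries comparing dob against string literals such as '1995-01-01' run without the MatchError, since Timestamp is a supported Spark SQL type in 1.0.x.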
