Parsing datetime from ISO 8601 using Spark SQL


Question


My dates are in this format, YYYY-MM-DDThh:mm:ss, and I want two columns, YYYY-MM-DD and hh:mm, that I can concat, if I want to, for certain queries.

I get an error when using convert(); I assume this is not currently supported in Spark SQL.

When I use date(datetime) or timestamp(datetime), all the returned values are null. However, minute(datetime) and hour(datetime) work.

Currently, I'm using this:

select concat(date, ' ', hour, ':', (case when minute < 10 then concat('0', minute) else minute end)) as DateTime
from (select OtherDateOnlyColumn as date, minute(datetime) as minute, hour(datetime) as hour from ...)

which is obviously inefficient.

Answer

I just tried date() on this query and it works:

select date(datetime) from df

Maybe the date column in your table is string type; you should check the data types of the columns with:

DESCRIBE your_table
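
For example, if datetime was ingested as a string rather than a timestamp, the output might look like this (a hypothetical sketch; Spark's DESCRIBE reports col_name, data_type, and comment):

col_name    data_type    comment
datetime    string       NULL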

If the date is string type, you can use cast(datetime as timestamp) as newTimestamp, which is available in Spark SQL, to convert the datetime back to a timestamp type, and use variants of date_format(newTimestamp, 'yyyy-MM-dd HH:mm') from there. (Note the lowercase yyyy and uppercase HH: in Spark's date patterns, YYYY is the week-based year and hh is the 12-hour clock, which would give wrong results here.)
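
Putting it together, a minimal sketch of the full conversion, assuming a table named df (as in the answer's first query) with a string column datetime holding ISO 8601 values like 2016-01-15T13:05:00:

select date_format(cast(datetime as timestamp), 'yyyy-MM-dd') as date_part,
       date_format(cast(datetime as timestamp), 'HH:mm') as time_part,
       date_format(cast(datetime as timestamp), 'yyyy-MM-dd HH:mm') as date_time
from df

The format pattern zero-pads hours and minutes on its own, so the case expression from the question's workaround is no longer needed. date_part, time_part, and date_time are illustrative aliases, not names from the original question.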

