Hive/SparkSQL: How to convert a Unix timestamp into a timestamp (not string)?



I thought this would be easy ...

In Hive/SparkSQL, how do I convert a unix timestamp [Note 1] into a timestamp data type?

(Note 1: That is, number of seconds/milliseconds since Jan 1, 1970)
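As a plain-Python illustration of what that number encodes (this is only a sketch of the epoch arithmetic, not part of Hive/Spark):

```python
from datetime import datetime, timezone

# 1508673584 seconds after 1970-01-01 00:00:00 UTC
dt = datetime.fromtimestamp(1508673584, tz=timezone.utc)
print(dt)  # 2017-10-22 11:59:44+00:00
```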

I thought from_unixtime() would do that, but it gives back a string instead of a timestamp. The following experiment illustrates the problem.

Step 0: Preparation

select 
  from_unixtime(1508673584) as fut;

Result:

-----------------------
| fut                 |
| ------------------- |
| 2017-10-22 11:59:44 |
-----------------------

Step 1: Create a table with the result of from_unixtime()

create table test
select 
  from_unixtime(1508673584) as fut;

Step 2: Examine the datatype of the column fut

describe test;

Result:

----------------------------------
| col_name | data_type | comment |
| -------- | --------- | ------- |
| fut      | string    | <null>  |
----------------------------------

I also tried this

select 
  from_utc_timestamp(1508618794*1000, 'EDT');

According to the manual (link here), this should work. Because it states that:

Converts a timestamp* in UTC to a given timezone (as of Hive 0.8.0). * timestamp is a primitive type, including timestamp/date, tinyint/smallint/int/bigint, float/double and decimal. Fractional values are considered as seconds. Integer values are considered as milliseconds. E.g. from_utc_timestamp(2592000.0,'PST'), from_utc_timestamp(2592000000,'PST') and from_utc_timestamp(timestamp '1970-01-30 16:00:00','PST') all return the timestamp 1970-01-30 08:00:00

However, I got an error:

Error: org.apache.spark.sql.AnalysisException: 
  cannot resolve 'from_utc_timestamp((1508618794 * 1000), 'EDT')' 
  due to data type mismatch: 
  argument 1 requires timestamp type, 
  however, '(1508618794 * 1000)' is of int type.; line 2 pos 2;
'Project [unresolvedalias(from_utc_timestamp((1508618794 * 1000), EDT), None)]
+- OneRowRelation$

SQLState:  null
ErrorCode: 0    
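The mismatch above runs into the seconds-vs-milliseconds convention the manual describes. A plain-Python sketch of that convention (an illustration of the scaling, not Hive's actual implementation):

```python
from datetime import datetime, timezone

sec = 1508618794         # fractional values would be read as seconds
ms = 1508618794 * 1000   # integer values would be read as milliseconds

# Both denote the same instant once milliseconds are scaled back to seconds
as_sec = datetime.fromtimestamp(sec, tz=timezone.utc)
as_ms = datetime.fromtimestamp(ms / 1000, tz=timezone.utc)
print(as_sec)  # 2017-10-21 20:46:34+00:00
```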

Solution

(I am providing an answer myself here.)

The answer is to use cast(). This works for both date and timestamp:

select 
  from_unixtime(1508673584)                    as fut,
  cast(from_unixtime(1508673584) as date)      as futAsDate,
  cast(from_unixtime(1508673584) as timestamp) as futAsTimestamp;

Result:

------------------------------------------------------------
| fut                 | futAsDate  | futAsTimestamp        |
| ------------------- | ---------- | --------------------- |
| 2017-10-22 11:59:44 | 2017-10-22 | 2017-10-22 11:59:44.0 |
------------------------------------------------------------
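For readers outside SQL, the effect of the two casts can be mimicked in plain Python; the variable names and format string below are illustrative assumptions, not Hive API:

```python
from datetime import datetime

fut = "2017-10-22 11:59:44"  # what from_unixtime() returns: a string
fut_as_timestamp = datetime.strptime(fut, "%Y-%m-%d %H:%M:%S")  # ~ cast(... as timestamp)
fut_as_date = fut_as_timestamp.date()                           # ~ cast(... as date)
print(fut_as_date, fut_as_timestamp)
```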

Verification of data types

create table test2
select 
  from_unixtime(1508673584)                    as fut,
  cast(from_unixtime(1508673584) as date)      as futAsDate,
  cast(from_unixtime(1508673584) as timestamp) as futAsTimestamp;

And then

describe test2;  

Result:

----------------------------------------
| col_name       | data_type | comment |
| -------------- | --------- | ------- |
| fut            | string    | <null>  |
| futAsDate      | date      | <null>  |
| futAsTimestamp | timestamp | <null>  |
----------------------------------------
