Converting date to timestamp in Spark: unix_timestamp returns null
Question
Converting a date string to a timestamp with unix_timestamp in Spark returns null:
scala> import org.apache.spark.sql.functions.unix_timestamp
scala> spark.sql("select from_unixtime(unix_timestamp(('2017-08-13 00:06:05'),'yyyy-MM-dd HH:mm:ss')) AS date").show(false)
+----+
|date|
+----+
|null|
+----+
Accepted answer
The problem was the clock change (daylight saving) in Chile, thank you very much.
+-------------------+---------+
| DateIntermedia|TimeStamp|
+-------------------+---------+
|13-08-2017 00:01:07| null|
|13-08-2017 00:10:33| null|
|14-08-2016 00:28:42| null|
|13-08-2017 00:04:43| null|
|13-08-2017 00:33:51| null|
|14-08-2016 00:28:08| null|
|14-08-2016 00:15:34| null|
|14-08-2016 00:21:04| null|
|13-08-2017 00:34:13| null|
+-------------------+---------+
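The null rows line up with Chile's spring-forward transitions: in the America/Santiago zone, clocks jumped straight from 00:00 to 01:00 on 13 Aug 2017 (and on 14 Aug 2016), so wall-clock times between 00:00 and 00:59 on those dates never existed, and there is no instant for unix_timestamp to return. As a minimal illustration (plain Python's zoneinfo rather than Spark; the round-trip check below is one assumed way to detect a DST gap):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+, requires IANA tz data

SANTIAGO = ZoneInfo("America/Santiago")

def wall_time_exists(naive: datetime) -> bool:
    """A wall-clock time falls in a DST gap if converting it to UTC
    and back does not land on the same local time."""
    local = naive.replace(tzinfo=SANTIAGO)
    round_trip = local.astimezone(timezone.utc).astimezone(SANTIAGO)
    return round_trip.replace(tzinfo=None) == naive

# 00:06:05 on 13 Aug 2017 fell inside the 00:00-01:00 spring-forward gap
print(wall_time_exists(datetime(2017, 8, 13, 0, 6, 5)))   # False
# one hour later, valid wall-clock times resume
print(wall_time_exists(datetime(2017, 8, 13, 1, 6, 5)))   # True
```

The same check comes back False for the 14-08-2016 00:28:42 row in the table above, since 14 Aug 2016 was also a spring-forward date in Chile.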
The solution: set the timeZone:
spark.conf.set("spark.sql.session.timeZone", "UTC-6")
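With the session time zone pinned to a fixed offset, the parse no longer falls into a DST gap. A sketch against a running SparkSession named `spark` (as in a spark-shell session; `UTC` is used here for illustration, while the answer above used `UTC-6`):

```scala
// Sketch, assuming a live SparkSession `spark`.
// A fixed-offset zone has no daylight-saving gaps, so every
// wall-clock string maps to a valid instant.
spark.conf.set("spark.sql.session.timeZone", "UTC")
spark.sql(
  "select from_unixtime(unix_timestamp('2017-08-13 00:06:05','yyyy-MM-dd HH:mm:ss')) AS date"
).show(false)
// the query now returns 2017-08-13 00:06:05 instead of null
```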