spark unix_timestamp data type mismatch
Problem description
Could someone help guide me on what data type or format I need to pass to Spark's from_unixtime() function for it to work?
When I try the following it compiles, but the response is the expression's generated column name rather than a formatted current timestamp.
from_unixtime(current_timestamp())
The response is:
fromunixtime(currenttimestamp(),yyyy-MM-dd HH:mm:ss)
But when I try entering
from_unixtime(1392394861,"yyyy-MM-dd HH:mm:ss.SSSS")
the call simply fails with a type mismatch:
error: type mismatch;
 found   : Int(1392394861)
 required: org.apache.spark.sql.Column
       from_unixtime(1392394861,"yyyy-MM-dd HH:mm:ss.SSSS")
What am I missing? I've tried a number of different things, and tried reading the documentation on working with dates/times in Spark, but every example I've tried fails with a type mismatch.
Recommended answer
Use lit() (from org.apache.spark.sql.functions) to wrap the literal in a Column, like this:
from_unixtime(lit(1392394861), "yyyy-MM-dd HH:mm:ss.SSSS")
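A minimal self-contained sketch of the fix (assuming a local SparkSession; the formatted string depends on the session time zone):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{from_unixtime, lit}

object FromUnixtimeDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("from_unixtime-demo")
      .getOrCreate()
    import spark.implicits._

    // from_unixtime expects a Column of epoch seconds, not a raw Int,
    // so the literal must be wrapped with lit() to become a Column
    Seq(1).toDF("dummy")
      .select(from_unixtime(lit(1392394861), "yyyy-MM-dd HH:mm:ss.SSSS").as("ts"))
      .show(false)

    spark.stop()
  }
}
```

The same pattern applies to any Spark SQL function declared to take a Column: plain Scala values must go through lit() first.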
or, as zero323 mentioned:
from_unixtime(current_timestamp().cast("long"))
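This works because current_timestamp() already returns a Column (of TimestampType); casting it to "long" yields epoch seconds, which from_unixtime then formats. A sketch under the same local-SparkSession assumption:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{current_timestamp, from_unixtime}

object CurrentTimestampDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("current-timestamp-demo")
      .getOrCreate()
    import spark.implicits._

    // cast the TimestampType column to epoch seconds before formatting,
    // so from_unixtime receives the long it expects
    Seq(1).toDF("dummy")
      .select(from_unixtime(current_timestamp().cast("long")).as("now"))
      .show(false)

    spark.stop()
  }
}
```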