Apache Spark subtract days from timestamp column
Question
I am using Spark Datasets and am having trouble subtracting days from a timestamp column.

I would like to subtract days from a timestamp column and get a new column with the full datetime format. Example:
2017-09-22 13:17:39.900 - 10 ----> 2017-09-12 13:17:39.900
With the date_sub function I get 2017-09-12 without 13:17:39.900.
Answer
You cast the data to timestamp and use expr to subtract an INTERVAL:
import org.apache.spark.sql.functions.expr
import spark.implicits._  // needed for toDF and the $ column syntax (assumes a SparkSession named spark)

val df = Seq("2017-09-22 13:17:39.900").toDF("timestamp")

df.withColumn(
  "10_days_before",
  $"timestamp".cast("timestamp") - expr("INTERVAL 10 DAYS")
).show(false)
+-----------------------+---------------------+
|timestamp |10_days_before |
+-----------------------+---------------------+
|2017-09-22 13:17:39.900|2017-09-12 13:17:39.9|
+-----------------------+---------------------+
If the data is already of TimestampType, you can skip the cast.