spark sql string to timestamp missing milliseconds


Problem description

Why does:

import spark.implicits._
import org.apache.spark.sql.functions._
  val content = Seq(("2019", "09", "11","17","16","54","762000000")).toDF("year", "month", "day", "hour", "minute", "second", "nano")
  content.printSchema
  content.show
  content.withColumn("event_time_utc", to_timestamp(concat('year, 'month, 'day, 'hour, 'minute, 'second), "yyyyMMddHHmmss"))
    .withColumn("event_time_utc_millis", to_timestamp(concat('year, 'month, 'day, 'hour, 'minute, 'second, substring('nano, 0, 3)), "yyyyMMddHHmmssSSS"))
    .select('year, 'month, 'day, 'hour, 'minute, 'second, 'nano,substring('nano, 0, 3), 'event_time_utc, 'event_time_utc_millis)
    .show

miss the milliseconds?

+----+-----+---+----+------+------+---------+---------------------+-------------------+---------------------+
|year|month|day|hour|minute|second|     nano|substring(nano, 0, 3)|     event_time_utc|event_time_utc_millis|
+----+-----+---+----+------+------+---------+---------------------+-------------------+---------------------+
|2019|   09| 11|  17|    16|    54|762000000|                  762|2019-09-11 17:16:54|  2019-09-11 17:16:54|
+----+-----+---+----+------+------+---------+---------------------+-------------------+---------------------+

The format string is yyyyMMddHHmmssSSS, which should include the milliseconds in SSS if I am not mistaken.

Recommended answer

I have faced a similar problem. For Spark < 2.4, the official documentation says the following:

Convert time string to a Unix timestamp (in seconds) with a specified format (see [http://docs.oracle.com/javase/tutorial/i18n/format/simpleDateFormat.html]) to Unix timestamp (in seconds), return null if fail.

This means it handles only seconds.
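
To make the seconds-only behaviour concrete, here is a minimal sketch (not part of the original answer): unix_timestamp returns a bigint of whole seconds, so any timestamp derived from it cannot carry the 762 ms, no matter what the SSS part of the pattern parses.

import spark.implicits._
import org.apache.spark.sql.functions._

// The pattern consumes the millisecond digits, but the result is whole seconds.
Seq("20190911171654762").toDF("raw")
  .select(unix_timestamp('raw, "yyyyMMddHHmmssSSS").as("epoch_seconds"))
  .show(false)
// epoch_seconds is a bigint, e.g. 1568222214 in a UTC session: the 762 ms are gone.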

Spark >= 2.4 also handles SSS.
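
If you are on such a version, the expression from the question should already keep the milliseconds without any UDF; a quick check (a sketch, verify on your Spark version):

import spark.implicits._
import org.apache.spark.sql.functions._

Seq(("2019", "09", "11", "17", "16", "54", "762000000"))
  .toDF("year", "month", "day", "hour", "minute", "second", "nano")
  .withColumn("event_time_utc_millis",
    to_timestamp(
      concat('year, 'month, 'day, 'hour, 'minute, 'second, substring('nano, 0, 3)),
      "yyyyMMddHHmmssSSS"))
  .select('event_time_utc_millis)
  .show(false)
// Expected on a version that honours SSS: 2019-09-11 17:16:54.762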

Solution: the UDF below will help handle this scenario:

import java.text.SimpleDateFormat
import java.sql.Timestamp
import org.apache.spark.sql.functions._
import scala.util.{Try, Success, Failure}

// Parse the input with an explicit SimpleDateFormat pattern, keeping the
// millisecond part; return None for empty or unparseable input.
val getTimestampWithMilis: ((String, String) => Option[Timestamp]) = (input, frmt) => input match {
  case "" => None
  case _ => {
    val format = new SimpleDateFormat(frmt)
    Try(new Timestamp(format.parse(input).getTime)) match {
      case Success(t) => Some(t)
      case Failure(_) => None
    }
  }
}

val getTimestampWithMilisUDF = udf(getTimestampWithMilis)

For example:

val content = Seq(("2019", "09", "11","17","16","54","762000000")).toDF("year", "month", "day", "hour", "minute", "second", "nano")
val df = content.withColumn("event_time_utc", concat('year, 'month, 'day, 'hour, 'minute, 'second, substring('nano, 0, 3)))
df.show
+----+-----+---+----+------+------+---------+-----------------+
|year|month|day|hour|minute|second|     nano|   event_time_utc|
+----+-----+---+----+------+------+---------+-----------------+
|2019|   09| 11|  17|    16|    54|762000000|20190911171654762|
+----+-----+---+----+------+------+---------+-----------------+

df.withColumn("event_time_utc_millis", getTimestampWithMilisUDF($"event_time_utc", lit("yyyyMMddHHmmssSSS"))).show(1, false)
+----+-----+---+----+------+------+---------+-----------------+-----------------------+
|year|month|day|hour|minute|second|nano     |event_time_utc   |event_time_utc_millis  |
+----+-----+---+----+------+------+---------+-----------------+-----------------------+
|2019|09   |11 |17  |16    |54    |762000000|20190911171654762|2019-09-11 17:16:54.762|
+----+-----+---+----+------+------+---------+-----------------+-----------------------+

The resulting schema shows event_time_utc_millis as a timestamp column:

root
 |-- year: string (nullable = true)
 |-- month: string (nullable = true)
 |-- day: string (nullable = true)
 |-- hour: string (nullable = true)
 |-- minute: string (nullable = true)
 |-- second: string (nullable = true)
 |-- nano: string (nullable = true)
 |-- event_time_utc: string (nullable = true)
 |-- event_time_utc_millis: timestamp (nullable = true)
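
If the same conversion is also needed from SQL, the Scala function can be registered as a named UDF. This is only a sketch; the function name to_ts_millis and the view name events are invented here and are not part of the original answer:

// Hypothetical registration for SQL use; "to_ts_millis" is an example name.
spark.udf.register("to_ts_millis", getTimestampWithMilis)

df.createOrReplaceTempView("events")
spark.sql("SELECT event_time_utc, to_ts_millis(event_time_utc, 'yyyyMMddHHmmssSSS') AS ts FROM events").show(false)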
