How to convert a string column with milliseconds to a timestamp with milliseconds in Spark 2.1 using Scala?
Question
I am using Spark 2.1 with Scala.
How can I convert a string column with milliseconds to a timestamp with milliseconds?
I followed the approach from a related question, but the result I get has no milliseconds:
+---+-----------------------+---------------------+
|id |dts |ts |
+---+-----------------------+---------------------+
|1 |05/26/2016 01:01:01.601|2016-05-26 01:01:01.0|
|2 |#$@#@# |null |
+---+-----------------------+---------------------+
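The milliseconds are likely dropped because approaches such as `unix_timestamp` resolve only to whole seconds. `SimpleDateFormat` itself does keep the fraction when the pattern includes `.SSS`, which is what the recommended answer below relies on. A minimal plain-Scala sketch of just that parsing step (no Spark required; the object and method names are mine, for illustration):

```scala
import java.text.SimpleDateFormat
import java.sql.Timestamp
import scala.util.Try

object MillisDemo {
  // The .SSS part of the pattern preserves the millisecond component.
  def parseMillis(s: String): Option[Timestamp] = {
    val fmt = new SimpleDateFormat("MM/dd/yyyy HH:mm:ss.SSS")
    // parse() throws ParseException on malformed input; Try turns that into None
    Try(new Timestamp(fmt.parse(s).getTime)).toOption
  }

  def main(args: Array[String]): Unit = {
    println(parseMillis("05/26/2016 01:01:01.601")) // millisecond part survives
    println(parseMillis("#$@#@#"))                  // malformed input yields None
  }
}
```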
Recommended answer

A UDF with `SimpleDateFormat` works. The idea is taken from Ram Ghadiyaram's link to a UDF logic.
import java.text.SimpleDateFormat
import java.sql.Timestamp
import org.apache.spark.sql.functions.udf
import scala.util.{Try, Success, Failure}
// In spark-shell the implicits below are already in scope;
// in a standalone application also add: import spark.implicits._

val getTimestamp: String => Option[Timestamp] = s => s match {
  case "" => None
  case _ =>
    val format = new SimpleDateFormat("MM/dd/yyyy HH:mm:ss.SSS")
    // parse() throws on malformed input such as "#$@#@#"; Try maps that to None,
    // which Spark renders as null in the resulting column
    Try(new Timestamp(format.parse(s).getTime)) match {
      case Success(t) => Some(t)
      case Failure(_) => None
    }
}

val getTimestampUDF = udf(getTimestamp)
val tdf = Seq((1L, "05/26/2016 01:01:01.601"), (2L, "#$@#@#")).toDF("id", "dts")
val tts = getTimestampUDF($"dts")
tdf.withColumn("ts", tts).show(2, false)

Output:
+---+-----------------------+-----------------------+
|id |dts |ts |
+---+-----------------------+-----------------------+
|1 |05/26/2016 01:01:01.601|2016-05-26 01:01:01.601|
|2 |#$@#@# |null |
+---+-----------------------+-----------------------+
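One caveat not covered by the answer above: `SimpleDateFormat` is mutable and not thread-safe, which can matter when a parser instance is shared across threads. A hedged alternative sketch using the immutable, thread-safe `java.time.format.DateTimeFormatter` (available on Java 8+, which Spark 2.1 requires; the `SafeParse` name is mine, for illustration):

```scala
import java.sql.Timestamp
import java.time.LocalDateTime
import java.time.format.DateTimeFormatter
import scala.util.Try

object SafeParse {
  // DateTimeFormatter is immutable and thread-safe, unlike SimpleDateFormat,
  // so a single instance can be reused freely.
  private val fmt = DateTimeFormatter.ofPattern("MM/dd/yyyy HH:mm:ss.SSS")

  def parse(s: String): Option[Timestamp] =
    // LocalDateTime.parse throws DateTimeParseException on bad input;
    // Try maps that to None, matching the null-on-failure behavior above
    Try(Timestamp.valueOf(LocalDateTime.parse(s, fmt))).toOption

  def main(args: Array[String]): Unit = {
    println(parse("05/26/2016 01:01:01.601")) // millisecond part survives
    println(parse("#$@#@#"))                  // malformed input yields None
  }
}
```

Wrapping `SafeParse.parse _` in `udf(...)` should give the same null-on-failure column behavior shown in the table above.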