Converting timestamp to UTC in Spark Scala


Problem Description

My environment is Spark 2.1 with Scala.

This could be simple, but I am stuck on it.

My DataFrame, myDF, looks like this:

+--------------------+----------------+
|     orign_timestamp| origin_timezone|
+--------------------+----------------+
|2018-05-03T14:56:...|America/St_Johns|
|2018-05-03T14:56:...| America/Toronto|
|2018-05-03T14:56:...| America/Toronto|
|2018-05-03T14:56:...| America/Toronto|
|2018-05-03T14:56:...| America/Halifax|
|2018-05-03T14:56:...| America/Toronto|
|2018-05-03T14:56:...| America/Toronto|
+--------------------+----------------+

I need to convert orign_timestamp to UTC and add it as a new column to the DataFrame. The code below works fine:

myDF.withColumn(
  "time_utc",
  to_utc_timestamp(
    from_unixtime(unix_timestamp(col("orign_timestamp"), "yyyy-MM-dd'T'HH:mm:ss")),
    "America/Montreal"
  )
).show

The problem is that I have hard-coded the timezone to "America/Montreal". I need to take the timezone from the origin_timezone column instead. I tried:

myDF.withColumn(
  "time_utc",
  to_utc_timestamp(
    from_unixtime(unix_timestamp(col("orign_timestamp"), "yyyy-MM-dd'T'HH:mm:ss")),
    col("origin_timezone")  // passes a Column where a String is expected
  )
).show

I got this error:
<console>:34: error: type mismatch;
 found   : org.apache.spark.sql.Column
 required: String

I also tried the code below. It did not throw an exception, but the new column had the same time as orign_timestamp:

myDF.withColumn(
  "origin_timestamp",
  to_utc_timestamp(
    from_unixtime(unix_timestamp(col("orign_timestamp"), "yyyy-MM-dd'T'HH:mm:ss")),
    col("origin_timezone").toString  // compiles, but this is the literal string "origin_timezone", not the per-row value
  )
).show
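
Both attempts fail for the same underlying reason: in Spark 2.1 the Scala function is declared as to_utc_timestamp(ts: Column, tz: String), so the timezone must be a plain string. Calling .toString on a Column merely renders the column's name, and Spark 2.x resolves timezone IDs through java.util.TimeZone, which silently falls back to GMT for an ID it does not recognize, so the timestamps pass through unshifted. A minimal REPL illustration of those two facts:

import org.apache.spark.sql.functions.col

// A Column's toString is its expression text, not any row value:
col("origin_timezone").toString
// res0: String = origin_timezone

// java.util.TimeZone silently maps unknown IDs to GMT, which is why
// the second attempt ran without error but changed nothing:
java.util.TimeZone.getTimeZone("origin_timezone").getID
// res1: String = GMT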

Recommended Answer

Whenever you run into a problem like this one, you can use expr. It hands the snippet to Spark's SQL parser, and the SQL version of to_utc_timestamp accepts a column for the timezone argument, unlike the Scala function:

import org.apache.spark.sql.functions._
import spark.implicits._  // needed for .toDF on a local Seq; assumes an active SparkSession named spark

val df = Seq(
  ("2018-05-03T14:56:00", "America/St_Johns"),
  ("2018-05-03T14:56:00", "America/Toronto"),
  ("2018-05-03T14:56:00", "America/Halifax")
).toDF("origin_timestamp", "origin_timezone")

df.withColumn("time_utc",
  expr("to_utc_timestamp(origin_timestamp, origin_timezone)")
).show

// +-------------------+----------------+-------------------+
// |   origin_timestamp| origin_timezone|           time_utc|
// +-------------------+----------------+-------------------+
// |2018-05-03T14:56:00|America/St_Johns|2018-05-03 17:26:00|
// |2018-05-03T14:56:00| America/Toronto|2018-05-03 18:56:00|
// |2018-05-03T14:56:00| America/Halifax|2018-05-03 17:56:00|
// +-------------------+----------------+-------------------+

Or the same thing with selectExpr:

df.selectExpr(
  "*", "to_utc_timestamp(origin_timestamp, origin_timezone) as time_utc"
).show

// +-------------------+----------------+-------------------+
// |   origin_timestamp| origin_timezone|           time_utc|
// +-------------------+----------------+-------------------+
// |2018-05-03T14:56:00|America/St_Johns|2018-05-03 17:26:00|
// |2018-05-03T14:56:00| America/Toronto|2018-05-03 18:56:00|
// |2018-05-03T14:56:00| America/Halifax|2018-05-03 17:56:00|
// +-------------------+----------------+-------------------+
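
As a side note, if upgrading is an option (the question targets Spark 2.1): Spark 2.4 added a to_utc_timestamp(ts: Column, tz: Column) overload to the Scala API, so the per-row timezone can be passed directly without expr. A minimal sketch, assuming Spark 2.4+:

// Spark 2.4+ only: the tz argument may itself be a Column.
df.withColumn("time_utc",
  to_utc_timestamp(col("origin_timestamp"), col("origin_timezone"))
).show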

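For readers who must stay on Spark 2.1 but prefer the typed API over SQL strings, a small UDF can do the per-row conversion. This is a sketch, not part of the original answer; the helper name toUtc is made up, and it assumes Java 8's java.time classes:

import java.sql.Timestamp
import java.time.{LocalDateTime, ZoneId}
import java.time.format.DateTimeFormatter
import org.apache.spark.sql.functions.{col, udf}

// Hypothetical helper: parse the local timestamp, interpret it in the
// row's timezone, then shift the instant to UTC.
val toUtc = udf { (ts: String, tz: String) =>
  val local = LocalDateTime.parse(ts, DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss"))
  Timestamp.valueOf(
    local.atZone(ZoneId.of(tz)).withZoneSameInstant(ZoneId.of("UTC")).toLocalDateTime
  )
}

df.withColumn("time_utc", toUtc(col("origin_timestamp"), col("origin_timezone"))).show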