Convert unix timestamp to avro and store it in BigQuery
Question
Avro schema:
{
  "name": "Entity",
  "type": "record",
  "namespace": "com.foobar.entity",
  "fields": [
    { "name": "attribute", "type": "string" },
    { "name": "value", "type": "int" },
    { "name": "timestamp", "type": { "type": "long", "logicalType": "timestamp-micros" } }
  ]
}
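As a quick illustration (a minimal sketch, not part of the original question): the timestamp-micros logical type in this schema annotates a long that counts microseconds since the Unix epoch, so a timezone-aware Python datetime maps to it as follows.

```python
from datetime import datetime, timezone

# timestamp-micros: a long counting microseconds since 1970-01-01T00:00:00Z.
# The example instant is the one mentioned in the question.
dt = datetime(2019, 6, 7, 9, 23, 38, tzinfo=timezone.utc)
micros = int(dt.timestamp() * 1_000_000)
print(micros)  # 1559899418000000
```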
The source timestamp is in Unix format with millisecond precision.
When I put such records into BigQuery, the data preview shows values like 1970-01-19 01:18:19.415 UTC. However, the value I stored is 1559899418, which is Friday, 7 June 2019 09:23:38. Any ideas why?
Reference: https://cloud.google.com/bigquery/docs/loading-data-cloud-storage-avro#logical_types
Answer
Your timestamp is off by a factor of 1000. Indeed, 1559899418 corresponds to Friday, 7 June 2019 09:23:38, but that is second precision (a plain Unix timestamp), not milliseconds. And 1559899 (one thousandth of 1559899418) does indeed correspond to 1970-01-19 01:18:19. Since your schema declares timestamp-micros, the long value must be microseconds since the epoch, so scale your seconds value by 1,000,000 before writing.