Exploding StructType as MapType Spark
Problem Description
Converting structType to MapType in Spark.
Schema:
event: struct (nullable = true)
| | event_category: string (nullable = true)
| | event_name: string (nullable = true)
| | properties: struct (nullable = true)
| | | prop1: string (nullable = true)
| | | prop2: string (nullable = true)
Sample data:
{ "event": {
    "event_category": "abc",
    "event_name": "click",
    "properties" : {
      "prop1": "prop1Value",
      "prop2": "prop2Value",
      ....
    }
  }
}
Desired output:
event_category | event_name | properties_key | properties_value |
abc | click | prop1 | prop1Value
abc | click | prop2 | prop2Value
Recommended Answer
You will have to find some mechanism to create a map of the properties struct. I have used a udf function to zip the keys and values and return arrays of key-value pairs.
import org.apache.spark.sql.functions._
def collectUdf = udf((cols: collection.mutable.WrappedArray[String], values: collection.mutable.WrappedArray[String]) => cols.zip(values))
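The body of the udf is plain Scala, so its zipping behavior can be sanity-checked without a Spark session. A minimal sketch (the column names and values below are hypothetical, mirroring the sample data):

```scala
object ZipCheck {
  def main(args: Array[String]): Unit = {
    // Mirrors the udf body: pair each property column name with its value,
    // producing the (key, value) tuples that explode will later fan out
    val cols = Seq("prop1", "prop2")
    val values = Seq("prop1Value", "prop2Value")
    val zipped = cols.zip(values)
    zipped.foreach(println)
  }
}
```

Each element of the result is a two-field tuple, which is why the final select can address `_1` and `_2` on the exploded column.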
Multiple generators are not supported in Spark, so you will have to save the exploded result into a temporary dataframe.
val columnsMap = df_json.select($"event.properties.*").columns
val temp = df_json.withColumn("event_properties", explode(collectUdf(lit(columnsMap), array($"event.properties.*"))))
The final step is to separate out the event_properties column:
temp.select($"event.event_category", $"event.event_name", $"event_properties._1".as("properties_key"), $"event_properties._2".as("properties_value")).show(false)
You should have what you desire:
+--------------+----------+--------------+----------------+
|event_category|event_name|properties_key|properties_value|
+--------------+----------+--------------+----------------+
|abc |click |prop1 |prop1Value |
|abc |click |prop2 |prop2Value |
+--------------+----------+--------------+----------------+
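The explode-over-entries semantics driving this result can be sketched in plain Scala, without Spark. The case class and sample values below are hypothetical stand-ins for the event rows: each row fans out into one output row per property, exactly as the exploded column does above.

```scala
// Hypothetical in-memory model of one event row from the sample data
case class Event(category: String, name: String, properties: Map[String, String])

object ExplodeSketch {
  def main(args: Array[String]): Unit = {
    val events = Seq(
      Event("abc", "click", Map("prop1" -> "prop1Value", "prop2" -> "prop2Value"))
    )
    // Each (key, value) entry of the map becomes its own row, carrying the
    // parent fields along -- the same fan-out that explode performs
    val rows = events.flatMap { e =>
      e.properties.map { case (k, v) => (e.category, e.name, k, v) }
    }
    rows.foreach(println)
  }
}
```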