Writing nested JSON in Spark Scala
Question
My Spark SQL query joins two tables with one-to-many cardinality, and I have to convert the output to JSON. This is what the query output looks like:
Address_id_parent | Address_id_child | Country_child | city_child
1 | 1 | India | Delhi
1 | 1 | US | NewYork
1 | 1 | US | NewJersey
The above data has to be converted to JSON in this form:
{
  "Address": {
    "Address_id_parent": "1"
  },
  "Address-details": [{
    "Address_id_child": "1",
    "location": [{
        "country": "India",
        "city": "Delhi"
      },
      {
        "country": "US",
        "city": "NewYork"
      },
      {
        "country": "US",
        "city": "NewJersey"
      }
    ]
  }]
}
How can I do this?
Answer
Check the DataFrame write interface with JSON:
df.write.format("json").save(path)
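The writer above emits one flat JSON object per row, so the nested shape in the question has to be built in the DataFrame first. A minimal sketch of one way to do that with `struct` and `collect_list`, assuming the joined query output is already in a DataFrame `df` with the four columns shown above (the names `nested` and `path` are illustrative):

```scala
import org.apache.spark.sql.functions.{col, collect_list, struct}

// First level of nesting: gather the (country, city) pairs for each
// child address into a "location" array.
val nested = df
  .groupBy(col("Address_id_parent"), col("Address_id_child"))
  .agg(collect_list(struct(
    col("Country_child").as("country"),
    col("city_child").as("city")
  )).as("location"))
  // Second level: gather the child records under each parent into
  // an "Address-details" array.
  .groupBy(col("Address_id_parent"))
  .agg(collect_list(struct(
    col("Address_id_child"),
    col("location")
  )).as("Address-details"))
  // Wrap the parent id in an "Address" struct to match the target shape.
  .select(
    struct(col("Address_id_parent")).as("Address"),
    col("Address-details")
  )

// Each row of `nested` is then written out as one JSON document.
nested.write.format("json").save(path)
```

Spark writes JSON as one object per line (JSON Lines), so each parent address becomes one self-contained document with the nested arrays inside it.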