Writing nested JSON in Spark Scala
Question
My Spark SQL query generates its output by joining two tables with one-to-many cardinality. I have to convert the data into JSON. This is what the query output looks like:
Address_id_parent | Address_id_child | Country_child | city_child
1 | 1 | India | Delhi
1 | 1 | US | NewYork
1 | 1 | US | NewJersey
The above data has to be converted to JSON like this:
{
  "Address": {
    "Address_id_parent": "1"
  },
  "Address-details": [{
    "Address_id_child": "1",
    "location": [{
        "country": "India",
        "city": "Delhi"
      },
      {
        "country": "US",
        "city": "NewYork"
      },
      {
        "country": "US",
        "city": "NewJersey"
      }
    ]
  }]
}
How can I accomplish this?
Answer
Check the DataFrame write interface with the JSON format:
df.write.format("json").save(path)
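Note that the one-liner above writes one flat JSON object per row; it will not produce the nested structure from the question on its own. One way to build the nesting first is to aggregate with collect_list and struct before writing. Below is a minimal sketch under the assumption that the flat query output has exactly the columns shown in the question; the output path is an example:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, collect_list, struct}

object NestedJsonExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("nested-json")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // The flat join output, as shown in the question
    val flat = Seq(
      (1, 1, "India", "Delhi"),
      (1, 1, "US", "NewYork"),
      (1, 1, "US", "NewJersey")
    ).toDF("Address_id_parent", "Address_id_child", "Country_child", "city_child")

    val nested = flat
      // Nest (country, city) pairs under each child as "location"
      .groupBy($"Address_id_parent", $"Address_id_child")
      .agg(collect_list(struct(
        $"Country_child".as("country"),
        $"city_child".as("city"))).as("location"))
      // Nest the children under each parent as "Address-details"
      .groupBy($"Address_id_parent")
      .agg(collect_list(struct($"Address_id_child", $"location")).as("Address-details"))
      // Wrap the parent id in an "Address" struct to match the target shape
      .select(struct($"Address_id_parent").as("Address"), col("Address-details"))

    // Each line of the output is one nested JSON document per parent
    nested.write.format("json").save("/tmp/nested_json_out") // example path
    spark.stop()
  }
}
```

Because the write interface emits one JSON document per row, grouping everything for a parent into a single row first is what makes the nesting appear in the file.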