Structured Streaming with different schemas in nested JSON

Problem Description

I have a scenario where the incoming message is JSON with a header, say tableName, and a data part that holds the table's column data. I want to write this out as Parquet to separate folders, say /emp and /dept. In regular (DStream) streaming I can achieve this by grouping rows by table name, but in Structured Streaming I am unable to split the stream. How can I achieve this in Structured Streaming?

{"tableName":员工",数据":{"empid":1," empname:" john," dept:" CS} {"tableName":雇员","data":{"empid":2","empname":"james","dept":"CS"} {"tableName":"dept","data":{"dept":"1","deptname":"CS","desc":计算机 科学部"}

{"tableName":"employee","data":{"empid":1","empname":"john","dept":"CS"} {"tableName":"employee","data":{"empid":2","empname":"james","dept":"CS"} {"tableName":"dept","data":{"dept":"1","deptname":"CS","desc":"COMPUTER SCIENCE DEPT"}

Answer

I got this working by looping through the list of expected tables; for each table I filter its records from the dataframe, apply the schema and encoder specific to that table, and then write to the sink. The read happens only once, writeStream is called once per table, and it works fine. Thanks for all the help.
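
A minimal sketch of that pattern in Scala, assuming Spark Structured Streaming reading JSON lines like the samples above from a socket source; the case classes, source, output paths, and checkpoint locations are illustrative assumptions rather than code from the original answer:

import org.apache.spark.sql.{Encoders, SparkSession}
import org.apache.spark.sql.functions.{col, from_json, get_json_object}

// Illustrative case classes matching the sample records above.
case class Employee(empid: Int, empname: String, dept: String)
case class Dept(dept: String, deptname: String, desc: String)

object SplitStreamByTable {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder.appName("split-stream-by-table").getOrCreate()

    // Read the raw JSON lines once; a socket source is assumed here.
    val raw = spark.readStream
      .format("socket")
      .option("host", "localhost")
      .option("port", 9999)
      .load()
      .selectExpr("CAST(value AS STRING) AS json")

    // (table name, schema for the "data" object, output folder); paths are assumed.
    val tables = Seq(
      ("employee", Encoders.product[Employee].schema, "/emp"),
      ("dept", Encoders.product[Dept].schema, "/dept")
    )

    // One writeStream per expected table: filter on the tableName header,
    // then parse the "data" part with that table's schema.
    tables.foreach { case (name, schema, path) =>
      raw
        .filter(get_json_object(col("json"), "$.tableName") === name)
        .select(from_json(get_json_object(col("json"), "$.data"), schema).as("data"))
        .select("data.*")
        .writeStream
        .format("parquet")
        .option("path", path)
        .option("checkpointLocation", s"/checkpoints/$name") // one checkpoint per query
        .start()
    }

    spark.streams.awaitAnyTermination()
  }
}

Each start() returns an independent StreamingQuery, so every table gets its own Parquet folder and checkpoint while sharing a single source definition.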

This also takes care of dynamically partitioning the Parquet output folders by table.
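
If every table's payload could be carried under one unified schema, a related alternative (an assumption here, not what the poster describes) is a single query that partitions the file sink by the header column, reusing raw and the imports from the sketch above:

import org.apache.spark.sql.types.{StringType, StructType}

// Keep the "data" payload as a raw JSON string (Spark returns nested objects
// as their JSON text for StringType fields) and let the Parquet sink partition
// by tableName, e.g. /tables/tableName=employee/ and /tables/tableName=dept/.
// Output and checkpoint paths are assumed.
val envelope = new StructType()
  .add("tableName", StringType)
  .add("data", StringType)

raw
  .select(from_json(col("json"), envelope).as("msg"))
  .select(col("msg.tableName").as("tableName"), col("msg.data").as("data"))
  .writeStream
  .format("parquet")
  .partitionBy("tableName")
  .option("path", "/tables")
  .option("checkpointLocation", "/checkpoints/all")
  .start()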
