How do we compress large payloads into Kinesis Streams?

Question

I have a JSON payload of 5 MB which I need to push to a Kinesis stream using PutRecords. Since the Kinesis record size limit is 1 MB, which methods should I follow to compress the data, and what are the steps?

Recommended answer

If your JSON payload is still too big after compression, you generally have two options:

1. Split it into multiple smaller payloads. The consumers must be able to reconstruct the payload from a part ID carried in each piece (see the first sketch after this list).

2. Store the large payload data outside of the stream, e.g. in S3, and send only metadata for the large file (e.g. its S3 path) in the messages (see the second sketch below).
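For option 1, here is a minimal sketch in Python with boto3, assuming a hypothetical envelope format (`payload_id`, `part`, `total_parts` fields) that producer and consumers agree on; this is a convention you define yourself, not a built-in Kinesis feature:

```python
import base64
import json
import uuid

import boto3

kinesis = boto3.client("kinesis")

# Raw chunk size chosen so that, after base64 inflation (~4/3) plus the JSON
# envelope, each record stays safely under the 1 MB Kinesis record limit.
MAX_PART_BYTES = 700 * 1024

def put_in_parts(stream_name: str, partition_key: str, payload: bytes) -> None:
    """Split one large payload into numbered parts and put each as a record."""
    payload_id = str(uuid.uuid4())  # lets the consumer group the parts together
    chunks = [payload[i:i + MAX_PART_BYTES]
              for i in range(0, len(payload), MAX_PART_BYTES)]
    for index, chunk in enumerate(chunks):
        envelope = {
            "payload_id": payload_id,
            "part": index,
            "total_parts": len(chunks),
            # base64 keeps arbitrary byte boundaries safe inside JSON
            "data": base64.b64encode(chunk).decode("ascii"),
        }
        kinesis.put_record(
            StreamName=stream_name,
            # one partition key for all parts keeps them on one shard, in order
            PartitionKey=partition_key,
            Data=json.dumps(envelope).encode("utf-8"),
        )
```

Using the same partition key for every part routes them all to one shard, so the consumer receives them in order and can reassemble once it has seen `total_parts` pieces for a given `payload_id`.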
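And a sketch of option 2, assuming a bucket you own and a made-up key scheme; the consumer reads the pointer record, then fetches the body from S3 (e.g. `s3.get_object(Bucket=..., Key=...)["Body"].read()`):

```python
import json
import uuid

import boto3

s3 = boto3.client("s3")
kinesis = boto3.client("kinesis")

def put_via_s3(stream_name: str, bucket: str, payload: bytes) -> None:
    """Park the payload in S3 and put only a small pointer record on the stream."""
    key = f"kinesis-payloads/{uuid.uuid4()}.json"  # hypothetical key scheme
    s3.put_object(Bucket=bucket, Key=key, Body=payload)
    # The record easily fits in 1 MB: it carries only the S3 location.
    pointer = {"bucket": bucket, "key": key, "size_bytes": len(payload)}
    kinesis.put_record(StreamName=stream_name,
                       PartitionKey=key,
                       Data=json.dumps(pointer).encode("utf-8"))
```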

Which compression to use depends on your stream producers; more specifically, on which compression algorithms they support.
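For example, gzip from Python's standard library is widely supported; here is a minimal sketch of compressing before PutRecord. Whether it gets a 5 MB document under 1 MB depends entirely on how compressible your JSON is:

```python
import gzip
import json

import boto3

kinesis = boto3.client("kinesis")

# Kinesis rejects records whose data blob exceeds 1 MB.
RECORD_LIMIT = 1024 * 1024

def put_compressed(stream_name: str, partition_key: str, document: dict) -> int:
    """Gzip a JSON document and put it on the stream; returns the compressed size."""
    raw = json.dumps(document).encode("utf-8")
    data = gzip.compress(raw)  # consumers must call gzip.decompress() on the blob
    if len(data) > RECORD_LIMIT:
        raise ValueError("payload still exceeds 1 MB after compression")
    kinesis.put_record(StreamName=stream_name,
                       PartitionKey=partition_key,
                       Data=data)
    return len(data)
```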

But ultimately, if neither of the two options suits you, then you may need to consider that Kinesis is not the right tool for the job. I think Apache Kafka can support messages larger than 1 MB.
