How can I make the consumer request more than 1MB of records from Kafka

Question

Whenever my consumer requests a new batch from Kafka, it always asks for 1MB of data, then it seems to request the next 1MB, and so forth. Does anybody know which configuration and programming steps are needed to receive batches of 20MB?

Answer

You can set the max.partition.fetch.bytes property in the consumer configuration to the value you want (the default is 1MB).
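For illustration, here is a minimal sketch using the Java consumer client, assuming a local broker at localhost:9092, a topic named my-topic, and a group id large-fetch-group (all placeholders):

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class LargeFetchConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker address
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "large-fetch-group");       // placeholder group id
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        // Request up to ~20MB per partition per fetch instead of the 1MB default.
        props.put(ConsumerConfig.MAX_PARTITION_FETCH_BYTES_CONFIG, 20 * 1024 * 1024);

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic"));        // placeholder topic
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            System.out.println("Fetched " + records.count() + " records");
        }
    }
}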

Also, this value must be equal to or greater than the maximum message size allowed by the broker (the message.max.bytes broker property, or max.message.bytes at the topic level), so that your consumers can read every message the broker accepts.
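For reference, a broker-side sketch of server.properties; 20971520 bytes (about 20MB) is just an illustrative value:

# server.properties (broker side): accept messages up to ~20MB
message.max.bytes=20971520

The same limit can also be applied per topic through the max.message.bytes topic configuration if you only need large messages on specific topics.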

Finally, if processing 20MB takes too long, you may want to increase the session.timeout.ms setting on the consumer (the default depends on the client version: 10 seconds in older clients, 45 seconds since Kafka 3.0) to avoid the broker deciding your consumer is dead and triggering a rebalance.
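Continuing the consumer sketch above, the timeout could be raised like this (60 seconds is an arbitrary illustrative value, and it must stay within the broker's group.max.session.timeout.ms limit):

// Give the group coordinator more time before it considers this consumer dead.
props.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, 60000);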
