Duplicate events by consumer


Question

We observed that one of our consumers tries to pick up the same events multiple times from a Kafka topic. We have the following settings on the consumer application side: `spring.kafka.consumer.enable-auto-commit=false` and `spring.kafka.consumer.auto-offset-reset=earliest`. How can we avoid duplicates in the consumer application? Do we need to fine-tune the above configuration settings to stop the consumer from picking up events multiple times from the Kafka topic?

Answer

Since you've disabled auto-commit, you do need to fine-tune when you actually commit a record; otherwise you end up with at-least-once processing, where a record processed before its offset is committed can be redelivered.
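With auto-commit off, Spring Kafka expects your listener to acknowledge records itself. A minimal sketch of that pattern, assuming `spring.kafka.listener.ack-mode=MANUAL` is set and with the topic name, group id, and `process()` helper as placeholders:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.support.Acknowledgment;
import org.springframework.stereotype.Component;

@Component
public class EventListener {

    @KafkaListener(topics = "events", groupId = "event-consumer")
    public void onEvent(String payload, Acknowledgment ack) {
        process(payload);   // do the real work first
        ack.acknowledge();  // commit the offset only after success
        // If process() throws, the offset is never committed and the
        // record is redelivered: at-least-once semantics. Making
        // process() idempotent is what keeps redelivery harmless.
    }

    private void process(String payload) {
        // application-specific handling (placeholder)
    }
}
```

Committing after processing trades possible duplicates for no data loss; committing before processing would do the opposite.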

You could also read up on Kafka's exactly-once processing capabilities, which use transactions and idempotent producers.
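For the exactly-once route, the producer side needs idempotence plus a transactional id, and the consumer side should read only committed records. A hedged sketch in Spring Boot `application.properties` form (the prefix and values are illustrative):

```properties
# Producer: idempotent writes; setting a transaction-id-prefix enables
# Kafka transactions in Spring Kafka
spring.kafka.producer.properties.enable.idempotence=true
spring.kafka.producer.transaction-id-prefix=tx-
spring.kafka.producer.acks=all

# Consumer: skip records belonging to aborted transactions
spring.kafka.consumer.isolation-level=read_committed
```

Note that exactly-once semantics cover the Kafka-to-Kafka path; side effects outside Kafka (database writes, HTTP calls) still need their own idempotence.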

`auto.offset.reset` only applies when your consumer group's offsets have been removed, or never existed at all (i.e. you're not committing anything). In that case, with `earliest`, you will always start reading from the beginning of the topic.
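To check whether that is what's happening, you can inspect the group's committed offsets with the `kafka-consumer-groups.sh` tool that ships with Kafka (group name and broker address are placeholders):

```shell
# Shows committed offset and lag per partition for the group.
# If CURRENT-OFFSET shows '-' for a partition, no offset was ever
# committed, so auto.offset.reset=earliest sends the consumer back
# to the beginning of that partition on startup.
bin/kafka-consumer-groups.sh --bootstrap-server localhost:9092 \
  --describe --group event-consumer
```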

