How to set max.poll.records in Kafka-Connect API
Question
I am using the Confluent 3.0.1 platform and building a Kafka-Elasticsearch connector. For this I am extending SinkConnector and SinkTask (Kafka Connect APIs) to get data from Kafka.
As part of this code I am overriding the taskConfigs method of SinkConnector to return "max.poll.records", so that only 100 records are fetched at a time. But it is not working: I am still getting all records at the same time, and I am failing to commit offsets within the stipulated time. Can anyone help me configure "max.poll.records"?
public List<Map<String, String>> taskConfigs(int maxTasks) {
    ArrayList<Map<String, String>> configs = new ArrayList<Map<String, String>>();
    for (int i = 0; i < maxTasks; i++) {
        Map<String, String> config = new HashMap<String, String>();
        config.put(ConfigurationConstants.CLUSTER_NAME, clusterName);
        config.put(ConfigurationConstants.HOSTS, hosts);
        config.put(ConfigurationConstants.BULK_SIZE, bulkSize);
        config.put(ConfigurationConstants.IDS, elasticSearchIds);
        config.put(ConfigurationConstants.TOPICS_SATELLITE_DATA, topics);
        config.put(ConfigurationConstants.PUBLISH_TOPIC, topicTopublish);
        config.put(ConfigurationConstants.TYPES, elasticSearchTypes);
        config.put("max.poll.records", "100");
        configs.add(config);
    }
    return configs;
}
Answer
You can't override most Kafka consumer configs like max.poll.records in the connector configuration. You can do so in the Connect worker configuration, though, with a consumer. prefix.
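As a sketch, the worker properties file (for example connect-distributed.properties; the exact filename depends on how you launch the worker, and the values below are illustrative) would carry the prefixed setting. The worker strips the consumer. prefix and passes the rest on to the consumers it creates for sink tasks:

```
# Worker configuration (e.g. connect-distributed.properties) -- illustrative values
bootstrap.servers=localhost:9092
group.id=connect-cluster

# The "consumer." prefix forwards the setting to every sink task's consumer;
# the prefix is stripped, so each consumer sees max.poll.records=100.
consumer.max.poll.records=100
```

Note that a worker-level setting like this applies to all sink connectors running on that worker, not just one connector, so restarting the worker is required for it to take effect.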