How can I send large messages with Kafka (over 15MB)?

Question

I send String messages to Kafka 0.8 with the Java Producer API. If the message size is about 15 MB, I get a MessageSizeTooLargeException. I have tried to set message.max.bytes to 40 MB, but I still get the exception. Small messages work without problems.

(The exception appears in the producer; I don't have a consumer in this application.)

What can I do to get rid of this exception?

private ProducerConfig kafkaConfig() {
    Properties props = new Properties();
    props.put("metadata.broker.list", BROKERS);              // list of broker host:port pairs
    props.put("serializer.class", "kafka.serializer.StringEncoder");
    props.put("request.required.acks", "1");                 // wait for the leader's acknowledgement
    props.put("message.max.bytes", "" + 1024 * 1024 * 40);   // attempt to allow messages up to 40 MB
    return new ProducerConfig(props);
}

Error log:

4709 [main] WARN  kafka.producer.async.DefaultEventHandler  - Produce request with correlation id 214 failed due to [datasift,0]: kafka.common.MessageSizeTooLargeException
4869 [main] WARN  kafka.producer.async.DefaultEventHandler  - Produce request with correlation id 217 failed due to [datasift,0]: kafka.common.MessageSizeTooLargeException
5035 [main] WARN  kafka.producer.async.DefaultEventHandler  - Produce request with correlation id 220 failed due to [datasift,0]: kafka.common.MessageSizeTooLargeException
5198 [main] WARN  kafka.producer.async.DefaultEventHandler  - Produce request with correlation id 223 failed due to [datasift,0]: kafka.common.MessageSizeTooLargeException
5305 [main] ERROR kafka.producer.async.DefaultEventHandler  - Failed to send requests for topics datasift with correlation ids in [213,224]

kafka.common.FailedToSendMessageException: Failed to send messages after 3 tries.
at kafka.producer.async.DefaultEventHandler.handle(Unknown Source)
at kafka.producer.Producer.send(Unknown Source)
at kafka.javaapi.producer.Producer.send(Unknown Source)

Answer

You need to adjust three (or four) properties (a broker/topic configuration sketch follows the list):

  • Consumer side: fetch.message.max.bytes - this will determine the largest size of a message that can be fetched by the consumer.
  • Broker side: replica.fetch.max.bytes - this will allow the replicas in the brokers to send messages within the cluster and make sure the messages are replicated correctly. If this is too small, then the message will never be replicated, and therefore the consumer will never see the message, because the message will never be committed (fully replicated).
  • Broker side: message.max.bytes - this is the largest size of a message that the broker can receive from a producer.
  • Broker side (per topic): max.message.bytes - this is the largest size of a message the broker will allow to be appended to the topic. This size is validated pre-compression. (Defaults to the broker's message.max.bytes.)
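
As a rough illustration, the broker-side and topic-level settings from the list might be raised like this. This is a sketch only: the 50 MB value and the ZooKeeper address are placeholders, the kafka-topics.sh --config syntax is the 0.8-era form, and the topic name datasift is taken from the error log above.

# server.properties (broker side) - illustrative 50 MB ceiling
message.max.bytes=52428800
replica.fetch.max.bytes=52428800

# optional per-topic override (placeholder ZooKeeper address)
bin/kafka-topics.sh --zookeeper localhost:2181 --alter --topic datasift \
    --config max.message.bytes=52428800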

I found out the hard way about number 2 (replica.fetch.max.bytes) - you don't get ANY exceptions, messages, or warnings from Kafka, so be sure to consider this when you are sending large messages.
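
On the consumer side (the first point in the list), with the 0.8 high-level consumer API that matches the producer code in the question, the fetch limit could be raised along these lines. This is a minimal sketch; the ZooKeeper address, group ID, class name, and 50 MB value are placeholders.

import java.util.Properties;

import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.javaapi.consumer.ConsumerConnector;

public class LargeMessageConsumerFactory {

    public static ConsumerConnector createConnector() {
        Properties props = new Properties();
        props.put("zookeeper.connect", "localhost:2181");    // placeholder ZooKeeper address
        props.put("group.id", "large-message-group");        // placeholder consumer group
        // Must be at least as large as the biggest message the broker accepts (message.max.bytes)
        props.put("fetch.message.max.bytes", "" + 1024 * 1024 * 50);
        return Consumer.createJavaConsumerConnector(new ConsumerConfig(props));
    }
}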
