How can I send large messages with Kafka (over 15 MB)?


Problem description

I send String messages to Kafka v0.8 with the Java producer API. If a message is about 15 MB in size, I get a MessageSizeTooLargeException. I have tried setting message.max.bytes to 40 MB, but I still get the exception. Small messages work without problems.

(The exception appears in the producer; I don't have a consumer in this application.)

What can I do to get rid of this exception?

// Requires: java.util.Properties, kafka.producer.ProducerConfig (0.8 producer API)
private ProducerConfig kafkaConfig() {
    Properties props = new Properties();
    props.put("metadata.broker.list", BROKERS);
    props.put("serializer.class", "kafka.serializer.StringEncoder");
    props.put("request.required.acks", "1");
    props.put("message.max.bytes", "" + 1024 * 1024 * 40); // 40 MB
    return new ProducerConfig(props);
}



Error log:

4709 [main] WARN  kafka.producer.async.DefaultEventHandler  - Produce request with correlation id 214 failed due to [datasift,0]: kafka.common.MessageSizeTooLargeException
4869 [main] WARN  kafka.producer.async.DefaultEventHandler  - Produce request with correlation id 217 failed due to [datasift,0]: kafka.common.MessageSizeTooLargeException
5035 [main] WARN  kafka.producer.async.DefaultEventHandler  - Produce request with correlation id 220 failed due to [datasift,0]: kafka.common.MessageSizeTooLargeException
5198 [main] WARN  kafka.producer.async.DefaultEventHandler  - Produce request with correlation id 223 failed due to [datasift,0]: kafka.common.MessageSizeTooLargeException
5305 [main] ERROR kafka.producer.async.DefaultEventHandler  - Failed to send requests for topics datasift with correlation ids in [213,224]

kafka.common.FailedToSendMessageException: Failed to send messages after 3 tries.
at kafka.producer.async.DefaultEventHandler.handle(Unknown Source)
at kafka.producer.Producer.send(Unknown Source)
at kafka.javaapi.producer.Producer.send(Unknown Source)


Answer

You need to adjust three (or four) properties:


  • Consumer side: fetch.message.max.bytes - this determines the largest size of a message that can be fetched by the consumer.
  • Broker side: replica.fetch.max.bytes - this allows the replicas in the brokers to send messages within the cluster and makes sure the messages are replicated correctly. If this is too small, the message will never be replicated, and therefore the consumer will never see it, because the message will never be committed (fully replicated).
  • Broker side: message.max.bytes - this is the largest size of a message the broker will accept from a producer.
  • Broker side (per topic): max.message.bytes - this is the largest size of a message the broker will allow to be appended to the topic. This size is validated pre-compression. (Defaults to the broker's message.max.bytes.)
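As a sketch of where these settings live, assuming the 40 MB limit from the question (41943040 bytes; file names follow the standard Kafka 0.8 distribution layout, so adjust to your deployment):

```properties
# server.properties (broker side)
message.max.bytes=41943040
replica.fetch.max.bytes=41943040

# consumer.properties (consumer side)
fetch.message.max.bytes=41943040
```

The per-topic max.message.bytes override is set through the topic admin tooling rather than a properties file (in 0.8.1+, kafka-topics.sh --alter with a --config max.message.bytes=... argument).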

I found out the hard way about number 2 - you don't get ANY exceptions, messages, or warnings from Kafka, so be sure to keep this in mind when you are sending large messages.
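Because an oversized message can fail loudly at the broker or silently at replication (per the note above), a client-side size check before calling send() can make the failure explicit. A minimal sketch, assuming the 40 MB limit from the question; the class and method names are hypothetical, not part of Kafka:

```java
// Hypothetical client-side guard; not part of the Kafka API.
public class MessageSizeGuard {
    // Assumption: mirrors the broker's message.max.bytes (40 MB, as in the question)
    static final int MAX_MESSAGE_BYTES = 40 * 1024 * 1024;

    // Returns true if the serialized payload fits under the broker limit.
    // Kafka adds per-message overhead (key, offset, CRC), so in practice
    // leave some headroom rather than relying on exact equality.
    static boolean fitsBrokerLimit(byte[] payload) {
        return payload.length <= MAX_MESSAGE_BYTES;
    }

    public static void main(String[] args) {
        System.out.println(fitsBrokerLimit(new byte[15 * 1024 * 1024])); // fits
        System.out.println(fitsBrokerLimit(new byte[45 * 1024 * 1024])); // too large
    }
}
```

Rejecting (or chunking) the payload in application code is cheaper than letting the producer retry and fail three times, as the error log above shows.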
