Using kafka to produce data for clickhouse


Problem Description

I want to use the Kafka integration for ClickHouse. I tried to follow the official tutorial. All tables have been created. I ran the Kafka server, then ran a Kafka producer and wrote a JSON object at the command prompt that matches a row in the database, like this:

{"timestamp":1554138000,"level":"first","message":"abc"}
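Since the table (shown below) uses `kafka_format = 'JSONEachRow'`, each Kafka message must be exactly one compact JSON object per line. A minimal Python sketch of building such a payload from the row above (the actual producer call is omitted, since it depends on which Kafka client you use):

```python
import json

# The row from the question, as a Python dict
row = {"timestamp": 1554138000, "level": "first", "message": "abc"}

# JSONEachRow expects one JSON object per line; the trailing newline
# marks the end of the row (cf. the kafka_row_delimiter setting)
payload = json.dumps(row, separators=(",", ":")) + "\n"

# Round-trip check: the line parses back to the same row
assert json.loads(payload) == row
```

Sending anything that is not a single well-formed JSON object per message will produce rows that ClickHouse cannot parse, which is exactly what `kafka_skip_broken_messages` (discussed below) controls.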

I checked the Kafka consumer and it received the object. But when I checked the tables in my ClickHouse database, they were empty. Any ideas what I did wrong?

Answer

Update

To ignore malformed messages, pass the kafka_skip_broken_messages parameter in the table definition.

It looks like a well-known issue that occurred in one of the latest versions of ClickHouse; try adding the extra parameter kafka_row_delimiter to the engine configuration:

CREATE TABLE queue (
 timestamp UInt64,
 level String,
 message String
) 
ENGINE = Kafka SETTINGS
  kafka_broker_list = 'localhost:9092',
  kafka_topic_list = 'topic',
  kafka_group_name = 'group1',
  kafka_format = 'JSONEachRow',
  kafka_row_delimiter = '\n',
  kafka_skip_broken_messages = 1;
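One more thing worth checking: a Kafka engine table is only a consumer, so rows never persist in it. The official tutorial pairs it with a MergeTree table plus a materialized view that streams data from `queue` into storage. A minimal sketch, where the names `daily` and `consumer` are assumptions following the tutorial's pattern:

```sql
-- Storage table (name 'daily' is an assumption)
CREATE TABLE daily (
  timestamp UInt64,
  level String,
  message String
) ENGINE = MergeTree()
ORDER BY timestamp;

-- Materialized view that moves rows from 'queue' into 'daily'
CREATE MATERIALIZED VIEW consumer TO daily
AS SELECT timestamp, level, message FROM queue;
```

With this in place, query `daily` (not `queue`) to see the consumed rows.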

