Kubernetes logs split in Kibana


Question

I have a Kubernetes cluster in Azure and used the following instructions to install Fluentd, Elasticsearch, and Kibana: https://github.com/kubernetes/kubernetes/tree/master/cluster/addons/fluentd-elasticsearch

I can see my pods' logs in Kibana, but when I send a log entry longer than 16K characters, it gets split. If I send 35K characters, it is split into 3 log entries.

How can I increase the limit for a single log entry? I want to be able to see the 36K characters in one log.


Answer

https://github.com/fluent-plugins-nursery/fluent-plugin-concat

did the job: it combines the split records back into one log entry. It solves Docker's maximum log line of 16KB, the problem of long lines in container logs getting split into multiple lines, and the 16KB maximum message size (for an 85KB message, the result was 6 messages created in different chunks).
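Docker terminates each complete log line with a newline, so the chunks it creates when splitting a long line can be recognized by the missing trailing \n. As a minimal sketch based on the plugin's documented options (the <filter **> match pattern is a placeholder; adjust it to your Fluentd tag layout), a concat filter like this reassembles the pieces:

    # Buffer record chunks and join them back into one event.
    <filter **>
      @type concat
      # Field that holds the container output in each record.
      key log
      # A chunk ending in a newline marks the end of the original line,
      # so flush the buffered event at that point.
      multiline_end_regexp /\n$/
      # Join chunks directly; Docker split mid-line, so add nothing between them.
      separator ""
    </filter>

Here, key names the record field to concatenate, multiline_end_regexp tells the plugin when a logical line is complete, and separator "" overrides the default "\n" so the rejoined message matches the original exactly.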
