Flink logging limitation: How to pass logging configuration to a flink job


Problem Description


I have a Flink job which uses logback as the logging framework, since the logs need to be sent to logstash and logback has a logstash appender (logstash-logback-appender). The appender works fine, and I can see the application logs in logstash when the Flink job is run from an IDE like Eclipse. The logging configuration file logback.xml is placed in src/main/resources and gets included on the classpath. The logging works fine even when running the job from the command line outside the IDE.
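
For reference, a minimal sketch of such a logback.xml (the appender and encoder classes come from logstash-logback-encoder; the destination host and port are placeholders):

<configuration>
  <!-- TCP appender from logstash-logback-encoder; replace host:port with your logstash input -->
  <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
    <destination>localhost:5044</destination>
    <!-- encodes each log event as a JSON document that logstash can ingest directly -->
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>
  <root level="INFO">
    <appender-ref ref="LOGSTASH"/>
  </root>
</configuration>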

However, when I deploy this job on a Flink cluster (standalone, started using ./start-cluster.bat) through the Flink dashboard, the logback configuration is ignored and the logs are not sent to logstash.

I read up more about Flink's logging mechanism and came across the documentation on configuring logback. The steps mentioned in this documentation work fine, with some additional steps such as adding the logstash-logback-encoder lib in the lib/ folder along with the logback jars.
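
Roughly, the additional steps were as follows (a sketch; the jar version numbers are examples, and the exact jars shipped in lib/ vary by Flink version):

# remove the log4j bridge jars that conflict with logback (exact names vary by Flink version)
rm lib/log4j-*.jar
cp logback-classic-1.2.3.jar logback-core-1.2.3.jar lib/
# add the encoder jar so the net.logstash.logback.* classes are on Flink's classpath
cp logstash-logback-encoder-4.11.jar lib/
# then add the logstash appender to conf/logback.xml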

Even though the steps mentioned above work, this is problematic, since the logback configuration in the flink/conf folder used by Flink applies to the entire Flink setup and all the jobs running on it. Jobs cannot have their own logging configuration. For example, I want job 1 to write to file, console, and logstash, and job 2 to write only to file, as sketched below.
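
Concretely, the two jobs would need different root sections in logback.xml, which a single cluster-wide file cannot express (appender definitions omitted, names illustrative):

<!-- what job 1 needs -->
<root level="INFO">
  <appender-ref ref="FILE"/>
  <appender-ref ref="CONSOLE"/>
  <appender-ref ref="LOGSTASH"/>
</root>

<!-- what job 2 needs -->
<root level="INFO">
  <appender-ref ref="FILE"/>
</root>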

How can each Flink job that is started from the dashboard be supplied with a separate logging configuration? Is there any way logging configuration can be passed while submitting the job on the dashboard?

Is there some way to force Flink to use the logging configuration on the classpath?

Solution

Flink currently does not support specifying individual logging configurations per job. The logging configuration is always valid for the whole cluster.

A way to solve this problem is to start the jobs in per-job mode. This means that you start a dedicated Flink cluster for every Flink job:

bin/flink run -m yarn-cluster -p 2 MyJobJar.jar
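
Since each per-job cluster reads its own configuration directory, you can give every job its own logback.xml by pointing the FLINK_CONF_DIR environment variable at a job-specific copy of the conf folder before submitting. A sketch with hypothetical paths:

# job 1: conf dir whose logback.xml has file, console and logstash appenders
export FLINK_CONF_DIR=/path/to/job1-conf
bin/flink run -m yarn-cluster -p 2 Job1Jar.jar

# job 2: conf dir whose logback.xml has only the file appender
export FLINK_CONF_DIR=/path/to/job2-conf
bin/flink run -m yarn-cluster -p 2 Job2Jar.jar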
