Flink logging limitation: How to pass logging configuration to a flink job


Problem description


I have a Flink job which uses logback as the logging framework, since the logs need to be sent to logstash and logback has a logstash appender (logstash-logback-appender). The appender works fine, and I can see the application logs in logstash when the Flink job is run from an IDE like Eclipse. The logging configuration file logback.xml is placed in src/main/resources and gets included on the classpath. The logging works fine even when running the job from the command line outside the IDE.
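
For reference, a minimal logback.xml along these lines might look as follows (a sketch: the destination host/port and log level are assumptions; the appender and encoder classes come from logstash-logback-encoder):

    <configuration>
      <!-- ship logs to logstash over TCP; destination is a placeholder -->
      <appender name="LOGSTASH" class="net.logstash.logback.appender.LogstashTcpSocketAppender">
        <destination>localhost:5044</destination>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
      </appender>
      <root level="INFO">
        <appender-ref ref="LOGSTASH"/>
      </root>
    </configuration>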

However, when I deploy this job on a Flink cluster (standalone, started using ./start-cluster.bat) through the Flink dashboard, the logback configuration is ignored and the logs are not sent to logstash.

I read up more about Flink's logging mechanism and came across the documentation on configuring logback. The steps mentioned in this documentation work, with some additional steps such as adding the logstash-logback-encoder lib to the lib/ folder along with the logback jars.
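
Concretely, the jar shuffle in the Flink distribution looks roughly like this (a sketch: the exact file names and version numbers are assumptions and depend on your Flink and logback versions):

    cd flink/lib
    # remove the log4j bindings so they do not conflict with logback
    rm log4j-*.jar slf4j-log4j12-*.jar
    # add logback plus the logstash encoder
    cp /path/to/logback-core-1.2.3.jar .
    cp /path/to/logback-classic-1.2.3.jar .
    cp /path/to/logstash-logback-encoder-6.6.jar .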

Even though the steps mentioned above work, this is problematic, since the logback configuration in the flink/conf folder used by Flink applies to the entire Flink setup and all the jobs running on it. The jobs cannot have their own logging configuration. For example, I want job1 to write to file, console, and logstash, and job2 to write only to file, as sketched below.
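
In logback terms, the desired split would amount to each job shipping its own root logger configuration (a sketch, assuming FILE, CONSOLE, and LOGSTASH appenders are defined elsewhere in each file):

    <!-- job1's logback.xml: file + console + logstash -->
    <root level="INFO">
      <appender-ref ref="FILE"/>
      <appender-ref ref="CONSOLE"/>
      <appender-ref ref="LOGSTASH"/>
    </root>

    <!-- job2's logback.xml: file only -->
    <root level="INFO">
      <appender-ref ref="FILE"/>
    </root>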

How can each Flink job that is started from the dashboard be supplied with a separate logging configuration? Is there any way a logging configuration can be passed while submitting the job on the dashboard?

Is there some way to force Flink to use the logging configuration on the classpath?

Solution

Flink currently does not support specifying individual logging configurations per job. The logging configuration is always valid for the whole cluster.

A way to solve this problem is to start the jobs in per-job mode. This means starting a dedicated Flink cluster for every Flink job:

    bin/flink run -m yarn-cluster -p 2 MyJobJar.jar
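
Since each job then gets its own cluster, each job can also get its own logging configuration, e.g. by pointing the client at a per-job conf directory (a sketch: the path is hypothetical, and it assumes the client picks up logback.xml from FLINK_CONF_DIR):

    # use a conf dir holding job1's flink-conf.yaml and logback.xml
    export FLINK_CONF_DIR=/path/to/job1-conf
    bin/flink run -m yarn-cluster -p 2 MyJobJar.jar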

