Send log messages from all celery tasks to a single file


Question

I'm wondering how to set up a more specific logging system. All my tasks use

logger = logging.getLogger(__name__)

as a module-wide logger.

I want celery to log to "celeryd.log" and my tasks to "tasks.log", but I have no idea how to get this working. Using CELERYD_LOG_FILE from django-celery, I can route all celeryd-related log messages to celeryd.log, but there is no trace of the log messages created in my tasks.

Answer

Note: This answer is outdated as of Celery 3.0, where you now use get_task_logger() to get your per-task logger set up. Please see the Logging section of the What's new in Celery 3.0 document for details.

Celery has dedicated support for logging, per task. See the Task documentation on the subject:

You can use the worker's logger to add diagnostic output to the worker log:

@celery.task()
def add(x, y):
    logger = add.get_logger()
    logger.info("Adding %s + %s" % (x, y))
    return x + y

There are several logging levels available, and the worker's loglevel setting decides whether or not they will be written to the log file.
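Under the hood this gating is plain stdlib behaviour. A minimal stdlib-only sketch, where the logger's level plays the role of the worker's loglevel setting (the file and logger names here are made up for illustration):

```python
import logging
import os
import tempfile

# Illustrative names; the logger level stands in for celeryd's --loglevel.
logfile = os.path.join(tempfile.mkdtemp(), 'worker.log')
logger = logging.getLogger('demo.worker')
logger.addHandler(logging.FileHandler(logfile))
logger.setLevel(logging.WARNING)       # like running the worker with loglevel=WARNING

logger.info('debug-ish detail')        # below WARNING: dropped
logger.warning('something happened')   # at WARNING: written to the file

with open(logfile) as f:
    content = f.read()
print('something happened' in content, 'debug-ish detail' in content)  # → True False
```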

Of course, you can also simply use print as anything written to standard out/-err will be written to the log file as well.

Under the hood this is all still the standard python logging module. You can set the CELERYD_HIJACK_ROOT_LOGGER option to False to allow your own logging setup to work, otherwise Celery will configure the handling for you.

However, for tasks, the .get_logger() call does allow you to set up a separate log file per individual task. Simply pass in a logfile argument and it'll route log messages to that separate file:

@celery.task()
def add(x, y):
    logger = add.get_logger(logfile='tasks.log')
    logger.info("Adding %s + %s" % (x, y))
    return x + y 

Last but not least, you can just configure your top-level package in the python logging module and give it a file handler of its own. I'd set this up using the celery.signals.after_setup_task_logger signal; here I assume all your modules live in a package called foo.tasks (as in foo.tasks.email and foo.tasks.scaling):

from celery.signals import after_setup_task_logger
import logging

def foo_tasks_setup_logging(**kw):
    logger = logging.getLogger('foo.tasks')
    if not logger.handlers:
        handler = logging.FileHandler('tasks.log')
        formatter = logging.Formatter(logging.BASIC_FORMAT) # you may want to customize this.
        handler.setFormatter(formatter)
        logger.addHandler(handler)
        logger.propagate = False

after_setup_task_logger.connect(foo_tasks_setup_logging)

Now any logger whose name starts with foo.tasks will have all its messages sent to tasks.log instead of to the root logger (which doesn't see any of these messages, because .propagate is False).
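This routing can be exercised with the stdlib alone, independent of Celery (the file path is illustrative):

```python
import logging
import os
import tempfile

logfile = os.path.join(tempfile.mkdtemp(), 'tasks.log')

pkg_logger = logging.getLogger('foo.tasks')
pkg_logger.addHandler(logging.FileHandler(logfile))
pkg_logger.setLevel(logging.INFO)
pkg_logger.propagate = False

# Records from any child logger bubble up to 'foo.tasks' and stop there.
logging.getLogger('foo.tasks.email').info('sent a mail')
# A logger outside the package never reaches the tasks.log handler.
logging.getLogger('foo.other').info('unrelated')

content = open(logfile).read()
print('sent a mail' in content, 'unrelated' in content)  # → True False
```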
