RabbitMQ/Celery/Django Memory Leak?


Problem Description



I recently took over another part of the project that my company is working on and have discovered what seems to be a memory leak in our RabbitMQ/Celery setup.

Our system has 2 GB of memory, with roughly 1.8 GB free at any given time. We have multiple tasks that crunch large amounts of data and add them to our database.

When these tasks run, they consume a rather large amount of memory, quickly dropping our available memory to anywhere between 16 MB and 300 MB. The problem is that after these tasks finish, the memory does not come back.

We're using:

  • RabbitMQ v2.7.1
  • AMQP 0-9-1 / 0-9 / 0-8 (got this line from the RabbitMQ startup_log)
  • Celery 2.4.6
  • Django 1.3.1
  • amqplib 1.0.2
  • django-celery 2.4.2
  • kombu 2.1.0
  • Python 2.6.6
  • erlang 5.8

Our server is running Debian 6.0.4.

I am new to this setup, so if there is any other information you need that could help me determine where this problem is coming from, please let me know.

All tasks have return values, all tasks have ignore_result=True, and CELERY_IGNORE_RESULT is set to True.
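
For reference, a task declared this way under Celery 2.x looks roughly like the sketch below; the task name and body are placeholders, not the real project code:

# Minimal sketch of a Celery 2.x task with results disabled (placeholder name/body).
from celery.task import task

@task(ignore_result=True)
def import_batch(batch):
    # ... crunch the data and write it to the database ...
    return len(batch)  # returned, but discarded because ignore_result=True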

Thank you very much for your time.

My current config file is:

CELERY_TASK_RESULT_EXPIRES = 30
CELERY_MAX_CACHED_RESULTS = 1
CELERY_RESULT_BACKEND = False
CELERY_IGNORE_RESULT = True
BROKER_HOST = 'localhost'
BROKER_PORT = 5672
BROKER_USER = c.celery.u
BROKER_PASSWORD = c.celery.p
BROKER_VHOST = c.celery.vhost

Solution

I am almost certain you are running this setup with DEBUG = True, which leads to a memory leak.

Check this post: Disable Django Debugging for Celery.
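
The usual mechanism here: with DEBUG = True, Django records every executed SQL query in django.db.connection.queries, and a long-running Celery worker never clears that list, so it grows with every database hit. A rough way to confirm this from a Django shell on the worker box (assuming the project settings are importable):

# Rough check from a Django shell (python manage.py shell) on the affected machine.
from django.conf import settings
from django.db import connection, reset_queries

print(settings.DEBUG)            # True here means the process keeps query history
print(len(connection.queries))   # grows with every ORM query while DEBUG is on

reset_queries()                  # clears the accumulated list as a stopgap;
                                 # setting DEBUG = False in the worker's settings avoids it entirely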

I'll post my configuration in case it helps.

settings.py

djcelery.setup_loader()
BROKER_HOST = "localhost"
BROKER_PORT = 5672
BROKER_VHOST = "rabbit"
BROKER_USER = "YYYYYY"
BROKER_PASSWORD = "XXXXXXX"

CELERY_IGNORE_RESULT = True
CELERY_DISABLE_RATE_LIMITS = True
CELERY_ACKS_LATE = True
CELERYD_PREFETCH_MULTIPLIER = 1
CELERYBEAT_SCHEDULER = "djcelery.schedulers.DatabaseScheduler"
CELERY_ROUTES = ('FILE_WITH_ROUTES',)
