Django Celery memory not released

Problem description

In my django project I have the following dependencies:

  • django==1.5.4
  • django-celery==3.1.9
  • amqp==1.4.3
  • kombu==3.0.14
  • librabbitmq==1.0.3 (as suggested by https://stackoverflow.com/a/17541942/1452356)

In dev_settings.py:

DEBUG = False
BROKER_URL = "django://"  # use the Django database as the broker
import djcelery
djcelery.setup_loader()
CELERYBEAT_SCHEDULER = "djcelery.schedulers.DatabaseScheduler"
CELERYD_CONCURRENCY = 2
# CELERYD_TASK_TIME_LIMIT = 10
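
The django:// broker also relies on companion apps being installed; if they are not already in the project's settings, the entries would look like this (a sketch, assuming the standard django-celery/kombu setup):

INSTALLED_APPS += (
  'djcelery',                # django-celery integration
  'kombu.transport.django',  # database-backed transport behind BROKER_URL = "django://"
)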

CELERYD_TASK_TIME_LIMIT is commented out, as suggested here: https://stackoverflow.com/a/17561747/1452356, along with debug_toolbar, as suggested by https://stackoverflow.com/a/19931261/1452356.

I start my worker in a shell with:

./manage.py celeryd --settings=dev_settings

Then I send a task:

from celery import Task

class ExempleTask(Task):

  def run(self, piProjectId):
    # build a large throwaway list so the worker's memory growth is visible
    table = []
    for i in range(50000000):
      table.append(1)
    return None

Using a Django command:

from django.core.management.base import BaseCommand

class Command(BaseCommand):

  def handle(self, *plArgs, **pdKwargs):
    # send the task to the worker and block until the result is ready
    loResult = ExempleTask.delay(1)
    loResult.get()
    return None

Called with:

./manage.py purge_and_delete_test --settings=dev_settings

I monitor the memory usage with:

watch -n 1 'ps ax -o rss,user,command | sort -nr | grep celery | head -n 5'

Every time I call the task, the memory consumption of the celeryd/worker process increases, proportionally to the amount of data allocated in it...

It seems like a common issue (c.f. the other Stack Overflow links); however, I couldn't fix it, even with the latest dependencies.

Thanks.

Answer

This is a Python and OS issue, not really a django or celery issue. Without getting too deep:

1) A process will never free memory addressing space once it has requested it from the OS. It never says "hey, I'm done here, you can have it back". In the example you've given, I'd expect the process size to grow for a while and then stabilize, possibly at a high base line. After your example allocation, you might call the gc interface to force a garbage collection and see how much memory is actually still in use.
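
A minimal sketch of that check (assuming Linux, where /proc/self/status exposes the resident set size; the rss_kb helper is invented for illustration):

import gc

def rss_kb():
  # Linux-only: read this process's resident set size, in kB, from /proc
  with open("/proc/self/status") as f:
    for line in f:
      if line.startswith("VmRSS:"):
        return int(line.split()[1])

table = []
for i in range(50000000):  # same allocation as ExempleTask
  table.append(1)
print("after allocation: %s kB" % rss_kb())
del table
gc.collect()  # force a full garbage collection
print("after gc.collect(): %s kB" % rss_kb())

The second number can stay well above the baseline even though the list is gone: CPython keeps freed objects and allocator pools (e.g. Python 2's integer free list) for reuse instead of handing the pages back to the OS.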

2) This isn't usually a problem, because the OS pages out unused pages once your process stops accessing the address space it has deallocated.

3) It is a problem if your process is leaking object references, preventing python from garbage collecting and re-appropriating the space for later reuse by that process, and requiring your process to ask the OS for more address space. At some point the OS cries uncle and will (probably) kill your process with its oomkiller or similar mechanism.
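
A hypothetical example of that kind of leak (LeakyTask and _lpCache are invented names, not from the question): a module-level list keeps a reference to every run's data, so the garbage collector can never reclaim it and the worker grows on every call:

from celery import Task

_lpCache = []  # module-level: outlives every task invocation

class LeakyTask(Task):

  def run(self, piProjectId):
    table = [1] * 1000000
    _lpCache.append(table)  # the reference survives the task, so this memory leaks
    return None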

4) If you are leaking, either fix the leak or set CELERYD_MAX_TASKS_PER_CHILD, and your child processes will (probably) commit suicide before upsetting the OS.
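
In settings terms that is a single line; 100 below is an arbitrary example value, to be tuned to the workload:

# dev_settings.py
# recycle each worker child after 100 tasks, so anything it leaked
# is returned to the OS when the old process exits
CELERYD_MAX_TASKS_PER_CHILD = 100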

This is a good general discussion on Python's memory management: CPython memory allocation

And a few minor things: use xrange, not range - range generates all the values up front and then iterates over that list, while xrange is just a generator. Also, have you set Django DEBUG = False?
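
Applied to the loop in ExempleTask, the difference looks like this (Python 2 semantics; in Python 3, range is already lazy):

# range materializes a ~50M-element list just to drive the loop
for i in range(50000000):
  table.append(1)

# xrange yields one value at a time, so the loop itself adds almost
# no overhead (the growing table is still the real payload)
for i in xrange(50000000):
  table.append(1)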
