How do I automatically kill a process that uses too much memory with Python?


Question

The situation: I have a website that allows people to execute arbitrary code in a different language (specifically, an esolang I created), using a Python interpreter on a shared-hosting server. I run this code in a separate process which is given a time limit of 60 seconds.

The problem: You can do stuff like (the Python equivalent of) 10**(10**10), which rapidly consumes far more memory than I have been allotted. Apparently it also locks up Apache, or makes it take too long to respond, so I have to restart it.

I have seen this question, but the given answer uses Perl, which I do not know at all, so I'd like an answer in Python. The OS is Linux, too.

Specifically, I want the following characteristics:

  1. Runs automatically
  2. Force-kills any process that exceeds some memory limit, e.g. 1 MB or 100 MB
  3. Kills any process spawned by my code that is more than 24 hours old (a watchdog sketch covering points 2 and 3 follows this list)
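
One way to act on points 2 and 3 is a small watchdog that the parent runs periodically (e.g. from cron or a background thread) over the children it has spawned. A minimal sketch, assuming the third-party psutil package (not mentioned in the question) and a uid-to-Process dict like the prgmT shown further down; the 100 MB cap and 24-hour cutoff are just the example values from the list:

import time

import psutil  # third-party; assumed available (pip install psutil)

MEM_LIMIT = 100 * 1024 * 1024  # e.g. 100 MB, per point 2
MAX_AGE = 24 * 60 * 60         # 24 hours in seconds, per point 3

def reap(children):
    # children maps uid -> multiprocessing.Process, like prgmT below.
    for uid, child in children.items():
        if not child.is_alive():
            continue
        try:
            info = psutil.Process(child.pid)
            too_big = info.memory_info().rss > MEM_LIMIT
            too_old = time.time() - info.create_time() > MAX_AGE
        except psutil.NoSuchProcess:
            continue  # exited between the is_alive() check and inspection
        if too_big or too_old:
            child.terminate()  # SIGTERM on Linux; escalate to kill() if ignored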

I use this piece of code (in a Django view) to create the process and run it (proxy_prgm is a Manager so I can retrieve data from the program that's interpreting the esolang code):

prgmT[uid] = multiprocessing.Process(
    target=proxy_prgm.runCatch,
    args=(steps,),
    name="program run")

prgmT[uid].start()
prgmT[uid].join(60)  # time limit of 1 minute

if prgmT[uid].is_alive():
    prgmT[uid].terminate()  # still alive after 60 s: force-stop it
    proxy_prgm.stop()

If you need more details, don't hesitate to tell me what to edit in (or ask me questions).

Answer

Another approach that might work: use resource.setrlimit() (more details in this other StackOverflow answer). It seems that by doing so you can set a memory limit on a process and its subprocesses; you'll have to figure out how to handle the limit being hit, though. I don't have personal experience using it, but hopefully doing so would stop Apache from locking up on you.
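
A minimal, self-contained sketch of that approach, assuming Linux and multiprocessing's default fork start method so the limit set in the child never touches the parent; the 100 MB cap, the hog() stand-in for the interpreter, and the exit code 42 are illustrative choices, not anything from the original setup:

import multiprocessing
import resource  # Unix-only, which matches the Linux requirement

MEM_LIMIT = 100 * 1024 * 1024  # illustrative 100 MB address-space cap

def hog():
    # Stand-in for the esolang interpreter: tries to build a huge int.
    print(10 ** (10 ** 10))

def run_limited(target):
    # RLIMIT_AS caps the process's virtual address space, and the limit
    # is inherited by any subprocesses this child spawns. Allocations
    # past the cap fail, so the huge-int case raises MemoryError
    # instead of dragging down the whole server.
    resource.setrlimit(resource.RLIMIT_AS, (MEM_LIMIT, MEM_LIMIT))
    try:
        target()
    except MemoryError:
        raise SystemExit(42)  # arbitrary exit code meaning "limit hit"

if __name__ == "__main__":
    p = multiprocessing.Process(target=run_limited, args=(hog,))
    p.start()
    p.join(60)  # keep the existing 60-second time limit
    if p.is_alive():
        p.terminate()
    print("exit code:", p.exitcode)  # 42 here means the memory cap fired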
