How can I spawn a long running process in a Perl CGI script?


Problem description

I'm writing a Perl CGI script right now, but it's becoming a resource hog and it keeps getting killed by my web host because I keep hitting my process memory limit. I was wondering if there is a way I can split the script I have into multiple scripts, and then have the first script call the next script and exit, so the entire script isn't in memory at once. I saw there is an Exporter module, but I don't know how to use it yet as I'm just learning Perl, and I don't think that will solve my memory problem, but I might be wrong.
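The "have the first script call the next script and exit" idea maps directly onto Perl's built-in `exec`, which replaces the current process image with the next program, so the two stages are never in memory at the same time. A minimal sketch; the function name and `stage2.pl` are placeholders, not anything from the original post:

```perl
#!/usr/bin/perl
# Hypothetical sketch: chain scripts with exec() so only one stage
# is resident at a time. 'stage2.pl' is a placeholder name.
use strict;
use warnings;

sub hand_off {
    my @next_cmd = @_;
    # exec never returns on success; the current process *becomes* @next_cmd
    exec @next_cmd or die "exec failed: $!";
}

# ... finish this stage's work, then, for example:
# hand_off($^X, 'stage2.pl', @ARGV);    # $^X is the running perl binary
```

Unlike `system`, which runs a child and keeps the parent alive, `exec` frees the first stage's memory entirely when the hand-off succeeds.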

Recommended answer

See Watching long processes through CGI.
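The usual pattern behind that article is to fork a worker off the CGI process and detach it, so the HTTP response can be sent immediately while the job keeps running. A minimal sketch assuming a POSIX-like system; `long_job.pl` and the function name are placeholders:

```perl
#!/usr/bin/perl
# Hypothetical sketch: detach a long-running worker from a CGI request.
# Assumes a POSIX system; 'long_job.pl' is a placeholder script name.
use strict;
use warnings;
use POSIX qw(setsid);

sub spawn_worker {
    my @cmd = @_;
    defined(my $pid = fork()) or die "fork failed: $!";
    return $pid if $pid;    # parent: return and finish the CGI response

    # Child: start a new session and drop the inherited filehandles,
    # so the web server does not keep the connection open waiting on us.
    setsid() or die "setsid failed: $!";
    open STDIN,  '<', '/dev/null' or die "reopen STDIN: $!";
    open STDOUT, '>', '/dev/null' or die "reopen STDOUT: $!";
    open STDERR, '>', '/dev/null' or die "reopen STDERR: $!";
    exec @cmd or die "exec failed: $!";
}

# In the CGI script, something like:
# spawn_worker($^X, 'long_job.pl');
# print "Content-type: text/plain\r\n\r\nJob started.\n";
```

In a real CGI you would also reap the child (or set `$SIG{CHLD} = 'IGNORE'`) so it doesn't linger as a zombie, and have the worker record its progress somewhere the next request can poll.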

On the other hand, just managing memory better might also solve your problem. For example, if you are reading entire files into memory at once, try to write the script so that it handles data line by line or in fixed-size chunks. Declare your variables in the smallest possible scope.
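The line-by-line approach can be sketched like this; only one line is held in memory at a time, instead of slurping the whole file with `my @lines = <$fh>;`. The function name and pattern are examples, not anything from the original answer:

```perl
#!/usr/bin/perl
# Hypothetical sketch: process a large file one line at a time
# instead of reading it all into an array.
use strict;
use warnings;

sub count_matching_lines {
    my ($path, $pattern) = @_;
    open my $fh, '<', $path or die "Cannot open $path: $!";
    my $count = 0;
    while (my $line = <$fh>) {    # reads exactly one line per iteration
        $count++ if $line =~ $pattern;
    }
    close $fh;
    return $count;
}
```

Because `while (my $line = <$fh>)` scopes `$line` to the loop body, each line is also released as soon as the next one is read.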

Try to identify what part of your script is creating the largest memory footprint and post the relevant excerpt in a separate question for more memory management suggestions.

