hard memory limits


Question





Hi,

I think I've hit a system limit in python when I try to construct a list
of 200,000 elements. My error is

malloc: vm_allocate (size = 2400256) failed......

Just wondering is this specific to my system or what? Will adding more
RAM helps in this case?

Thanks and cheers
Maurice
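
For scale: a 200,000-element list is a small allocation by modern standards, and the request that fails in the error message is only about 2.3 MB. A quick illustration (a sketch on a current CPython; sys.getsizeof did not exist in the 2.3/2.4 interpreters of the time):

    import sys

    items = list(range(200_000))      # the list from the question
    # Size of the list object itself (its internal pointer array),
    # not counting the integer objects it refers to:
    print(sys.getsizeof(items))       # roughly 1.6 MB on a 64-bit build

    # The allocation that failed in the error message:
    print(2400256 / (1024.0 * 1024))  # about 2.29 MiB

The numbers suggest the environment, not Python itself, is what ran out, which is where the answers below point.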

Answers

Maurice LING wrote:

> I think I've hit a system limit in python when I try to construct a list
> of 200,000 elements.

there's no such limit in Python.

> My error is
>
> malloc: vm_allocate (size = 2400256) failed......
>
> Just wondering is this specific to my system or what?

that doesn't look like a Python error (Python usually raises
MemoryError exceptions when it runs out of memory), and
there's no sign of any vm_allocate function in the Python
sources, so yes, it's a system-specific problem.

> Will adding more RAM helps in this case?



probably. more swap space might also help. or you could use a
smarter malloc package. posting more details on your platform,
toolchain, python version, and list building approach might also
help.

(are you perhaps building multiple lists piece by piece, interleaved
with other object allocations? if so, it's probably a fragmentation
problem. to check this, watch the process size. if it grows at a
regular rate, and then explodes just before you get the above error,
you may need to reconsider the design a bit).

</F>
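
A minimal sketch of the check Fredrik suggests, assuming a Unix-like system (the standard-library resource module reports ru_maxrss in kilobytes on Linux and in bytes on Mac OS X):

    import resource

    def peak_size():
        # Peak resident set size of this process so far
        # (kilobytes on Linux, bytes on Mac OS X).
        return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

    items = []
    try:
        for i in range(200_000):
            items.append(object())
            if i % 50_000 == 0:
                print(i, peak_size())
    except MemoryError:
        # A healthy CPython reports exhaustion this way rather than
        # dying inside malloc/vm_allocate as in the original report.
        print("out of memory after", len(items), "elements")

If the printed size grows at a steady rate and then jumps sharply just before a failure, fragmentation or the list-building pattern is the likely suspect.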


" Fredrik Lundh" < FR ***** @ pythonware.com>写道:
"Fredrik Lundh" <fr*****@pythonware.com> writes:
Maurice LING写道:
Maurice LING wrote:
在这种情况下会增加更多RAM吗?
Will adding more RAM helps in this case?



可能。更多交换空间也可能有所帮助。或者您可以使用更智能的malloc包。在您的平台上发布更多详细信息,
工具链,python版本和列表构建方法也可能有帮助。



probably. more swap space might also help. or you could use a
smarter malloc package. posting more details on your platform,
toolchain, python version, and list building approach might also
help.




如果没有平台信息,它会'很难说。在现代的Unix

系统中,只有在系统负载很重的情况下才会遇到系统资源限制。否则,您将达到每个进程的限制。在

后一种情况下,添加RAM或交换根本没有帮助。提高每个进程限制

是解决方案。


< mike

-

Mike Meyer< mw*@mired.org> http://www.mired.org/home/mwm/

独立的WWW / Perforce / FreeBSD / Unix顾问,电子邮件以获取更多信息。



Without platform information, it''s hard to say. On a modern Unix
system, you only run into system resource limits when the system is
heavily loaded. Otherwise, you''re going to hit per-process limits. In
the latter case, adding RAM or swap won''t help at all. Raising the
per-process limits is the solution.

<mike
--
Mike Meyer <mw*@mired.org> http://www.mired.org/home/mwm/
Independent WWW/Perforce/FreeBSD/Unix consultant, email for more information.
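
The per-process limits Mike refers to can be listed with "ulimit -a" in most shells; from Python they are visible through the standard-library resource module. A sketch, assuming a Unix-like platform (raising the hard limit itself requires privileges):

    import resource

    # Current ceilings for this process; RLIM_INFINITY means "unlimited".
    soft, hard = resource.getrlimit(resource.RLIMIT_DATA)
    print("data segment limit:", soft, hard)

    # An unprivileged process may raise its soft limit up to the hard limit:
    resource.setrlimit(resource.RLIMIT_DATA, (hard, hard))

If the soft limit sits well below physical memory, raising it is more likely to help than adding RAM or swap.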


On 5/6/05, Mike Meyer <mw*@mired.org> wrote:

> "Fredrik Lundh" <fr*****@pythonware.com> writes:
> > Maurice LING wrote:
> > > Will adding more RAM helps in this case?
> >
> > probably. more swap space might also help. or you could use a
> > smarter malloc package. posting more details on your platform,
> > toolchain, python version, and list building approach might also
> > help.
>
> Without platform information, it's hard to say. On a modern Unix
> system, you only run into system resource limits when the system is
> heavily loaded. Otherwise, you're going to hit per-process limits. In
> the latter case, adding RAM or swap won't help at all. Raising the
> per-process limits is the solution.
>
> <mike
> --
> Mike Meyer <mw*@mired.org> http://www.mired.org/home/mwm/
> Independent WWW/Perforce/FreeBSD/Unix consultant, email for more information.

A quick google shows it's mac os X, and a frequent error message..

http://www.google.com/search?hl=en&q...=Google+Search

Peace
Bill Mill
bill.mill at gmail.com



