Python 2.7 MemoryError (64bit, Ubuntu) with plenty of RAM

Question

Python 2.7.10 (via conda) on Ubuntu 14.04 with 60GB RAM.

Working with large datasets in IPython notebooks. Getting MemoryErrors even though my reading of the 'top' output suggests there are many GB left for the process to grow into. Here's a representative excerpt from 'top':

KiB Mem:  61836572 total, 61076424 used,   760148 free,     2788 buffers
KiB Swap:        0 total,        0 used,        0 free. 31823408 cached Mem

   PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND                                                                                                                                                                                  
 81176 ubuntu    20   0 19.735g 0.017t   3848 R 100.9 30.3  12:48.89 /home/ubuntu/miniconda/envs/ds_notebook/bin/python -m ipykernel -f /run/user/1000/jupyter/kernel-4c9c1a51-da60-457b-b55e-faadf9ae06fd.json                                              
 80702 ubuntu    20   0 11.144g 9.295g      8 S   0.0 15.8   1:27.28 /home/ubuntu/miniconda/envs/ds_notebook/bin/python -m ipykernel -f /run/user/1000/jupyter/kernel-1027385c-f5e2-42d9-a5f0-7d837a39bdfe.json                                               

So those two processes are using just over 30GB of address space, and about 26GB resident. (All other processes are tiny.)

My understanding (shared by many online sources) is that the ~31GB 'cached' total is available to be reclaimed by programs when needed. (The output of free -m likewise shows 30+GB in buffers/cache.)
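For reference, a minimal sketch (assuming a Linux /proc filesystem; field names can vary slightly across kernels) that estimates the reclaimable total the same way older versions of free do:

# Rough estimate of memory the kernel could reclaim for new allocations.
# Assumes Linux /proc/meminfo; values there are reported in KiB.
def meminfo():
    info = {}
    with open('/proc/meminfo') as f:
        for line in f:
            key, value = line.split(':')
            info[key.strip()] = int(value.split()[0])
    return info

m = meminfo()
reclaimable_kib = m['MemFree'] + m.get('Buffers', 0) + m.get('Cached', 0)
print('Roughly reclaimable: %.1f GiB' % (reclaimable_kib / 1024.0 / 1024.0))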

And yet, Python is failing to allocate new structures of just a couple GB.
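A hypothetical reproduction of that failure mode (the ~3GB size is illustrative, not taken from the question; numpy is assumed because it requests one contiguous block, the usual trigger for MemoryError on an otherwise healthy system):

# Illustrative only: attempt one multi-GB contiguous allocation.
import numpy as np

try:
    big = np.zeros(int(3e9) // 8, dtype=np.float64)  # ~3GB in one block
except MemoryError:
    print('MemoryError despite GBs reported free/cached')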

All the limits reported by the Python 'resource' module appear unset.
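A quick way to confirm (the limits printed here are just a sample; resource.RLIM_INFINITY, i.e. -1, means a limit is unset):

# Print the soft/hard limits most relevant to large allocations.
import resource

for name in ('RLIMIT_AS', 'RLIMIT_DATA', 'RLIMIT_RSS', 'RLIMIT_STACK'):
    limit_id = getattr(resource, name, None)
    if limit_id is None:
        continue  # not every platform defines every limit
    soft, hard = resource.getrlimit(limit_id)
    print('%-12s soft=%s hard=%s' % (name, soft, hard))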

Why won't the Python process take (or be given) any more free address space and physical memory?

Answer

This may not be the answer; more investigation, and details of exactly what you are doing and how the machine is configured, would be needed. But: you have less than 1GB free (760MB), yet 31GB cached. So it is possible that no more memory can be allocated because of memory fragmentation. Presumably all that cached memory was left behind by data loaded and freed earlier, and after enough of that churn, fragmentation prevents the allocation of such a large contiguous piece. With no swap configured, this is a real problem.
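If a contiguous-allocation limit is suspected, a small diagnostic sketch (the function name and the 64GB ceiling are arbitrary) can probe the largest single block the allocator will currently grant:

# Purely diagnostic: double the request until one contiguous
# allocation fails, then report the last size that succeeded.
def largest_block_gb(max_gb=64):
    size_gb = 1
    while size_gb <= max_gb:
        try:
            buf = bytearray(size_gb * 1024 ** 3)  # one contiguous block
            del buf
        except MemoryError:
            return size_gb // 2
        size_gb *= 2
    return max_gb

print('Largest single allocation: ~%d GB' % largest_block_gb())

Checking SwapTotal in /proc/meminfo (or free -m) is also worthwhile: with zero swap, anonymous pages cannot be evicted, so only the page cache is reclaimable.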
