Efficient memory management in R
Question
I have 6 GB memory in my machine (Windows 7 Pro 64 bit) and in R, I get
> memory.limit()
6141
Of course, when dealing with big data, a memory allocation error occurs. So, to make R use virtual memory, I use
> memory.limit(50000)
Now, when running my script, I don't get a memory allocation error any more, but R hogs all the memory in my computer, so I can't use the machine until the script is finished. I wonder if there is a better way to make R manage the machine's memory. One thing it could do is switch to virtual memory once its physical memory use exceeds a user-specified amount. Is there any option like that?
Answer
Look at the ff and bigmemory packages. These use functions that know about R objects to keep them on disk, rather than leaving it to the OS (which only knows about chunks of memory, not what they represent).
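A minimal sketch of what that looks like in practice, assuming the ff and bigmemory packages have been installed from CRAN (the file names `bm.bin` and `bm.desc` below are arbitrary examples):

```r
# install.packages(c("ff", "bigmemory"))  # one-time setup

library(ff)         # file-backed vectors and data frames
library(bigmemory)  # file-backed / shared-memory matrices

# ff: a file-backed integer vector; only the chunks you index
# are actually paged into RAM, the rest stays on disk
x <- ff(vmode = "integer", length = 1e7)
x[1:5] <- 1:5
sum(x[1:5])  # indexing pulls just the needed chunk into memory

# bigmemory: a file-backed matrix whose backing file persists,
# so it can be re-attached in a later R session via the descriptor
bm <- filebacked.big.matrix(nrow = 1e4, ncol = 10,
                            backingfile = "bm.bin",
                            descriptorfile = "bm.desc")
bm[1, ] <- rnorm(10)
```

Because the data lives in files rather than in R's heap, the working-set size in RAM stays small even for objects far larger than physical memory, which is exactly the behavior the question asks for.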