Forcing garbage collection to run in R with the gc() command
Question
Periodically I program sloppily. OK, I program sloppily all the time, but sometimes that catches up with me in the form of out-of-memory errors. I start exercising a little discipline in deleting objects with the rm() command and things get better. I see mixed messages online about whether I should explicitly call gc() after deleting large data objects. Some say that before R returns a memory error it will run gc(), while others say that manually forcing gc() is a good idea.
Should I run gc() after deleting large objects in order to ensure maximum memory availability?
Solution

"Probably." I do it too, and often even in a loop, as in
cleanMem <- function(n=10) { for (i in 1:n) gc() }
Yet that does not, in my experience, restore memory to a pristine state.
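For intuition, here is a minimal sketch (the object name and size are illustrative) of deleting a large object and then calling gc(); a handy side effect is that gc() also returns a small matrix reporting current Ncells/Vcells memory usage:

```r
# Minimal sketch: allocate a large object, drop it, then collect.
x <- runif(1e7)                      # roughly 76 MB of doubles
print(object.size(x), units = "Mb")  # report the object's size

rm(x)  # remove the only reference to the vector
gc()   # force a collection; returns a matrix of Ncells/Vcells usage
```

Even after this, the R process may not hand all of that memory back to the operating system, which is what motivates the script-based workflow below.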
So what I usually do is to keep the tasks at hand in script files and execute those using the 'r' frontend (on Unix, and from the 'littler' package). Rscript is an alternative on that other OS.
That workflow happens to agree with
- Workflow for statistical analysis and report writing
- Tricks to manage the available memory in an R session
which we covered here before.