limiting memory usage in R under linux


Question


We are running R in a linux cluster environment. The head node has had a few hangs when a user has inadvertently taken all the memory using an R process. Is there a way to limit R memory usage under linux? I'd rather not suggest global ulimits, but that may be the only way forward.
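Short of a solution inside R itself, the shell-level mechanism the questioner alludes to can be scoped to a single session rather than applied globally. A minimal sketch (the 2048000 KiB figure is an assumed 2000 MiB cap, not from the original post):

```shell
# Cap virtual memory for one R session only.
# Run in a subshell so the parent shell keeps its own limits.
(
  ulimit -v 2048000        # soft limit in KiB: 2000 MiB = 2000 * 1024 KiB
  ulimit -v                # print the limit now in effect
  # R --no-save < script.R  # an R process started here would inherit the cap
)
```

Because the limit is set inside `( ... )`, it dies with the subshell and never touches the head node's other processes.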

Answer


I have created a small R package, ulimit, that allows setting memory limits for a running R process using the same mechanism that is also used for ulimit in the shell. Currently the package doesn't work on Windows -- use memory.limit() from the utils package if you run Windows. EDIT: It also doesn't work on the "other" POSIX platform -- ulimit -v has no effect on OS X...

Usage:

devtools::install_github("krlmlr/ulimit")


To limit the memory available to R to 2000 MiB, simply call:

ulimit::memory_limit(2000)

Now:

> rep(0L, 1e9)
Error: cannot allocate vector of size 3.7 Gb
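The reported size checks out: `rep(0L, 1e9)` requests one billion 4-byte integers, which is about 3.7 GiB. A quick arithmetic check in the shell:

```shell
# 1e9 integers * 4 bytes each, converted to GiB (2^30 bytes)
awk 'BEGIN { printf "%.1f\n", 1e9 * 4 / 2^30 }'   # prints 3.7
```

Since 3.7 GiB exceeds the 2000 MiB cap set above, the allocation fails with an error instead of exhausting the machine's memory.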


The package is functional, but in a very early stage. Support for Windows is planned but not implemented yet. Feedback is greatly appreciated!

