R's coreNLP::initCoreNLP() throws java.lang.OutOfMemoryError
Problem description
coreNLP is an R package for interfacing with Stanford's CoreNLP Java libraries. The first line one must execute (after loading the appropriate packages with the library() command) is initCoreNLP(). Unfortunately, this results in the following error:
Loading classifier from edu/stanford/nlp/models/ner/english.conll.4class.distsim.crf.ser.gz ... Error in rJava::.jnew("edu.stanford.nlp.pipeline.StanfordCoreNLP", basename(path)) : java.lang.OutOfMemoryError: GC overhead limit exceeded
Note, this is the same problem that is listed here: (initCoreNLP() method call from the Stanford's R coreNLP package throws error). In that case, however, the OP found that rebooting his machine made the problem disappear. This is not the case for me; I keep experiencing it even after a reboot.
Has anyone else run into this and can provide a solution or suggestion?
Thanks in advance, HazMat
R version 3.2.3 (2015-12-10)
rJava version 0.9-7
coreNLP version 0.4-1
Machine: Win 7 with 8GB RAM
Answer
Here is some documentation I found:
https://cran.r-project.org/web/packages/coreNLP/coreNLP.pdf (see page 7 in particular)
You can specify how much memory you use (from the documentation):
initCoreNLP(libLoc, parameterFile, mem = "4g", annotators)
Add more memory and I would imagine the problem will go away.
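A minimal sketch of the suggested fix, assuming coreNLP 0.4-1 where initCoreNLP() accepts a mem argument passed to the JVM as its maximum heap size; the "6g" value is an illustrative choice for a machine with 8 GB of RAM, not a recommendation:

```r
# Sketch under the assumptions above, not a definitive fix.
library(coreNLP)

# Request a larger JVM heap before the annotators (such as the CRF
# classifier that triggered the OutOfMemoryError) are loaded.
initCoreNLP(mem = "6g")
```

An alternative sometimes used with rJava-based packages is to set the JVM options before any Java-backed package initializes the JVM, e.g. options(java.parameters = "-Xmx6g") as the very first line of the session; once the JVM has started, heap settings can no longer be changed, which is why the memory must be specified up front either way.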