Out of memory error while working on large dataset
Question
I am running code for LSI, which first requires fetching a lot of data from a database. It works fine for a small data set, but as I increase the data-set size, it gives me the following error.
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
I am currently running the code on a system with 2 GB of RAM. Is the error related to RAM capacity, or is it due to something else?
Thanks!
Answer
When you run Java you have to pass VM parameters for your specific needs. You need to increase the heap values:
-Xms40m - minimum (initial) heap size
-Xmx1024m - maximum heap size

Note that JVM options must come before the class name, and `java` runs a compiled class, not a `.java` file:

java -Xms40m -Xmx1024m Test
when launching your app. For more information, refer to the Oracle documentation. Or, if you use Eclipse, increase this size in the eclipse.ini file.
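To verify that the new limits actually took effect, you can print the JVM's view of its own heap. A minimal sketch (the class name `HeapCheck` is made up for illustration):

```java
// HeapCheck.java - prints the JVM's effective heap limits.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        // maxMemory() reflects -Xmx; totalMemory() is the heap currently
        // allocated by the JVM; freeMemory() is the unused part of that.
        System.out.println("Max heap:   " + rt.maxMemory() / (1024 * 1024) + " MB");
        System.out.println("Total heap: " + rt.totalMemory() / (1024 * 1024) + " MB");
        System.out.println("Free heap:  " + rt.freeMemory() / (1024 * 1024) + " MB");
    }
}
```

Running `javac HeapCheck.java` and then `java -Xms40m -Xmx1024m HeapCheck` should report a max heap of roughly 1024 MB; if it reports the old default, the options were not applied.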