java.lang.OutOfMemoryError: GC overhead limit exceeded


Problem Description

I am getting this error in a program that creates several (hundreds of thousands of) HashMap objects, each with a few (15-20) text entries. These Strings all have to be collected (without breaking up into smaller amounts) before being submitted to a database.

According to Sun, the error happens "if too much time is being spent in garbage collection: if more than 98% of the total time is spent in garbage collection and less than 2% of the heap is recovered, an OutOfMemoryError will be thrown."

Apparently, one could use the command line to pass arguments to the JVM for

  • Increasing the heap size, via "-Xmx1024m" (or more), or
  • Disabling the error check altogether, via "-XX:-UseGCOverheadLimit".
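As a quick sanity check (my addition, not part of the original question), the heap limit the JVM actually picked up from such a flag can be read back at runtime:

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Reports the maximum heap the JVM will attempt to use,
        // e.g. after launching with: java -Xmx1024m HeapCheck
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MiB");
    }
}
```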

The first approach works fine, the second ends up in another java.lang.OutOfMemoryError, this time about the heap.

So, question: is there any programmatic alternative to this, for the particular use case (i.e., several small HashMap objects)? If I use the HashMap clear() method, for instance, the problem goes away, but so do the data stored in the HashMap! :-)

The issue is also discussed in a related topic on StackOverflow.

Solution

You're essentially running out of memory to run the process smoothly. Options that come to mind:

  1. Specify more memory like you mentioned, try something in between like -Xmx512m first
  2. Work with smaller batches of HashMap objects to process at once if possible
  3. If you have a lot of duplicate strings, use String.intern() on them before putting them into the HashMap
  4. Use the HashMap(int initialCapacity, float loadFactor) constructor to tune for your case
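Points 3 and 4 can be sketched together. This is only an illustration; the key name and sizing below are made up, assuming roughly 20 entries per map and many duplicate keys across the hundreds of thousands of maps:

```java
import java.util.HashMap;
import java.util.Map;

public class InternDemo {
    public static void main(String[] args) {
        // Pre-size each map so it never rehashes for ~20 entries:
        // capacity >= expected / loadFactor, so 32 comfortably holds 20 at 0.75.
        Map<String, String> record = new HashMap<>(32, 0.75f);

        String key = new String("customer_id"); // a fresh, non-pooled String
        String pooled = key.intern();           // canonical copy from the String pool
        record.put(pooled, "42");

        // All interned copies of equal Strings are the same object, so many
        // maps built this way share one "customer_id" instance instead of
        // holding hundreds of thousands of duplicates.
        System.out.println(pooled == "customer_id".intern()); // prints true
        System.out.println(record);                           // prints {customer_id=42}
    }
}
```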
