Why do I get an OutOfMemoryError when inserting 50,000 objects into HashMap?

Problem Description

I am trying to insert about 50,000 objects (and therefore 50,000 keys) into a java.util.HashMap<java.awt.Point, Segment>. However, I keep getting an OutOfMemoryError. (Segment is my own class and is very lightweight: one String field and 3 int fields. A minimal sketch of this setup follows at the end of the question.)

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
    at java.util.HashMap.resize(HashMap.java:508)
    at java.util.HashMap.addEntry(HashMap.java:799)
    at java.util.HashMap.put(HashMap.java:431)
    at bus.tools.UpdateMap.putSegment(UpdateMap.java:168)

This seems quite ridiculous since I see that there is plenty of memory available on the machine - both in free RAM and HD space for virtual memory.

Is it possible Java is running with some stringent memory requirements? Can I increase these?

Is there some weird limitation with HashMap? Am I going to have to implement my own? Are there any other classes worth looking at?

(I am running Java 5 under OS X 10.5 on an Intel machine with 2GB RAM.)
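Segment's full definition isn't shown, so here is a minimal reconstruction consistent with the description above (the class layout, field names, and Point scheme are illustrative guesses, not the original code):

import java.awt.Point;
import java.util.HashMap;
import java.util.Map;

// Illustrative stand-in for the Segment class described above:
// one String field and three int fields.
class Segment {
    String name;
    int a, b, c;

    Segment(String name, int a, int b, int c) {
        this.name = name;
        this.a = a;
        this.b = b;
        this.c = c;
    }
}

public class MapFill {
    public static void main(String[] args) {
        Map<Point, Segment> map = new HashMap<Point, Segment>();
        for (int i = 0; i < 50000; i++) {
            // Use distinct Points so every put() adds a new entry
            // rather than overwriting an existing one.
            map.put(new Point(i % 1000, i / 1000), new Segment("s" + i, i, i, i));
        }
        System.out.println("Inserted " + map.size() + " entries");
    }
}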

Solution

You can increase the maximum heap size by passing -Xmx128m (where 128 is the number of megabytes) to java. I can't remember the default size, but it strikes me that it was something rather small.
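For example, assuming the program's entry point is the bus.tools.UpdateMap class seen in the stack trace above (substitute your actual main class; the 512 MB figure is just an illustration), the invocation might look like:

java -Xmx512m bus.tools.UpdateMap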

You can programmatically check how much memory is available by using the Runtime class.

// Get current size of heap in bytes
long heapSize = Runtime.getRuntime().totalMemory();

// Get maximum size of heap in bytes. The heap cannot grow beyond this size.
// Any attempt will result in an OutOfMemoryError.
long heapMaxSize = Runtime.getRuntime().maxMemory();

// Get amount of free memory within the heap in bytes. This size will increase
// after garbage collection and decrease as new objects are created.
long heapFreeSize = Runtime.getRuntime().freeMemory();

(Example from Java Developers Almanac)
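For a quick sanity check, these values can be printed in megabytes; a minimal sketch (the formatting and labels are illustrative):

System.out.printf("heap: total=%d MB, max=%d MB, free=%d MB%n",
        heapSize / (1024 * 1024),
        heapMaxSize / (1024 * 1024),
        heapFreeSize / (1024 * 1024));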

This is also partially addressed in Frequently Asked Questions About the Java HotSpot VM, and in the Java 6 GC Tuning page.
