LRU LinkedHashMap that limits size based on available memory


Problem description



I want to create a LinkedHashMap which will limit its size based on available memory (ie. when freeMemory + (maxMemory - allocatedMemory) gets below a certain threshold). This will be used as a form of cache, probably using "least recently used" as a caching strategy.
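One way to sketch this is to override `LinkedHashMap.removeEldestEntry` in access-order mode, evicting the least-recently-used entry whenever the available-memory estimate above drops below a threshold. The class name `MemoryBoundedLruCache` and the threshold parameter are illustrative, not from the question; this is a minimal sketch, not a production cache (note it evicts at most one entry per insertion):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: an LRU LinkedHashMap that evicts while the JVM's
// estimated available memory is below a threshold.
class MemoryBoundedLruCache<K, V> extends LinkedHashMap<K, V> {
    private final long thresholdBytes;

    MemoryBoundedLruCache(long thresholdBytes) {
        // accessOrder = true: iteration order is least-recently-used first,
        // so the "eldest" entry is the LRU candidate.
        super(16, 0.75f, true);
        this.thresholdBytes = thresholdBytes;
    }

    // freeMemory + (maxMemory - allocatedMemory), as in the question;
    // Runtime.totalMemory() is the currently allocated heap.
    static long availableMemory() {
        Runtime rt = Runtime.getRuntime();
        return rt.freeMemory() + (rt.maxMemory() - rt.totalMemory());
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Called after each insertion; returning true drops the LRU entry.
        return availableMemory() < thresholdBytes;
    }
}
```

A caveat the question itself raises: `totalMemory()` counts not-yet-collected garbage, so this estimate can lag reality and trigger more eviction than necessary.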

My concern though is that allocatedMemory also includes (I assume) un-garbage collected data, and thus will over-estimate the amount of used memory. I'm concerned about the unintended consequences this might have.

For example, the LinkedHashMap may keep deleting items because it thinks there isn't enough free memory, but the free memory doesn't increase because these deleted items aren't being garbage collected immediately.

Does anyone have any experience with this type of thing? Is my concern warranted? If so, can anyone suggest a good approach?

I should add that I also want to be able to "lock" the cache, basically saying "ok, from now on don't delete anything because of memory usage issues".
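The "lock" requirement fits naturally into the same `removeEldestEntry` hook: a flag that, while set, makes the method always return false. The class and method names here (`LockableLruCache`, `lock`, `unlock`) are invented for illustration:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: an LRU map with a "lock" switch that suspends
// memory-based eviction entirely.
class LockableLruCache<K, V> extends LinkedHashMap<K, V> {
    private final long thresholdBytes;
    private volatile boolean locked = false;  // when true, never evict

    LockableLruCache(long thresholdBytes) {
        super(16, 0.75f, true);  // access order -> LRU eviction candidates
        this.thresholdBytes = thresholdBytes;
    }

    void lock()   { locked = true; }
    void unlock() { locked = false; }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        if (locked) {
            return false;  // "from now on, don't delete anything"
        }
        Runtime rt = Runtime.getRuntime();
        long available = rt.freeMemory() + (rt.maxMemory() - rt.totalMemory());
        return available < thresholdBytes;
    }
}
```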

Solution

I know I'm biased, but I really have to strongly recommend our MapMaker for this. Use the softKeys() or softValues() feature, depending on whether it's GC collection of the key or of the value that more aptly describes when an entry can be cleaned up.
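The idea behind `softValues()` is to hold cache values through soft references, so the garbage collector itself reclaims entries under memory pressure instead of the cache second-guessing free memory. (In later Guava versions, `MapMaker.softKeys()`/`softValues()` were deprecated in favor of `CacheBuilder.newBuilder().softValues()`.) As a rough stdlib-only sketch of the soft-value mechanism, assuming a made-up `SoftValueCache` wrapper, not Guava's actual implementation:

```java
import java.lang.ref.SoftReference;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch of the soft-values idea: values are reachable only
// through SoftReferences, so the GC may clear them when memory runs low.
class SoftValueCache<K, V> {
    private final ConcurrentHashMap<K, SoftReference<V>> map = new ConcurrentHashMap<>();

    void put(K key, V value) {
        map.put(key, new SoftReference<>(value));
    }

    V get(K key) {
        SoftReference<V> ref = map.get(key);
        if (ref == null) {
            return null;  // never cached
        }
        V value = ref.get();
        if (value == null) {
            map.remove(key);  // value was cleared by the GC; drop the stale entry
        }
        return value;
    }
}
```

Guava's real implementation also cleans up stale entries eagerly via a `ReferenceQueue` rather than only on lookup.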

