Solr Filter Cache (FastLRUCache) takes too much memory and results in out of memory?

Question

I have a Solr setup: one master and two slaves for replication. We have about 70 million documents in the index. The slaves have 16 GB of RAM: 10 GB for the OS and HD, 6 GB for Solr.

But from time to time, the slaves run out of memory. When we downloaded the heap dump just before one ran out of memory, we could see that the class:

org.apache.solr.util.ConcurrentLRUCache$Stats @ 0x6eac8fb88

is using up to 5 GB of memory. We use the filter cache extensively; it has a 93% hit ratio. Here is the filter cache configuration in solrconfig.xml:

<property name="filterCache.size" value="2000" />
<property name="filterCache.initialSize" value="1000" />
<property name="filterCache.autowarmCount" value="20" />

<filterCache class="solr.FastLRUCache"
             size="${filterCache.size}"
             initialSize="${filterCache.initialSize}"
             autowarmCount="${filterCache.autowarmCount}"/>

The query result cache has the same settings, but it uses LRUCache and only takes about 35 MB of memory. Is there something wrong with the configuration that needs to be fixed, or do I just need more memory for the filter cache?

Answer

After a friend explained to me roughly how the filter cache works, it became clear why we get out-of-memory errors from time to time.

So what does the filter cache do? Basically, it creates something like a bit array that records which documents matched the filter. Something like:

cache = [1, 0, 0, 1, .. 0]

A 1 means a hit and a 0 means no hit, so in this example the cached filter matches the 0th and the 3rd documents. Each cache entry is therefore a bit array whose length equals the total number of documents in the index. Say I have 50 million docs; then the array length will be 50 million, which means one filter cache entry takes up 50,000,000 bits in RAM.

And since we specified that we want 2000 filter cache entries, the RAM they can take is roughly:

50,000,000 * 2000 = 100,000,000,000 bits

Converted to GB, that is:

100,000,000,000 bits / 8 (to bytes) / 1000 (to KB) / 1000 (to MB) / 1000 (to GB) = 12.5 GB

So the total RAM needed just for the filter cache is roughly 12.5 GB, which means that if Solr only has 6 GB of heap space, it will not be able to hold 2000 filter cache entries.
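
The same back-of-the-envelope estimate as a minimal Python sketch (just a calculator for the numbers above; the 1000-based divisors mirror the conversion in the text):

num_docs = 50_000_000    # documents in the index
cache_entries = 2_000    # filterCache.size in solrconfig.xml

# Worst case: every cached filter is a full bit set over all documents.
total_bits = num_docs * cache_entries
# bits -> bytes -> KB -> MB -> GB (1000-based, as above)
total_gb = total_bits / 8 / 1000 / 1000 / 1000

print(f"{total_gb:.1f} GB")  # 12.5 GB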

Yes, I know Solr doesn't always create this full array; if a filter query matches only a few documents, it can store something smaller that takes up less memory. This calculation just gives a rough upper bound for the filter cache with 2000 entries in RAM; in more favorable cases it can be much lower.

So one solution is to lower the maximum number of filter cache entries in the Solr config. We checked the Solr stats, and most of the time we only had about 600 cached filters, so we can reduce the maximum filter cache size to around that number.
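
As a sketch, the cap could be lowered through the same properties shown in the question (600 reflects the number of cached filters we actually saw in the stats; the lowered initialSize is my assumption, chosen only so it stays below the new maximum):

<property name="filterCache.size" value="600" />
<property name="filterCache.initialSize" value="300" />  <!-- assumption: keep below size -->
<property name="filterCache.autowarmCount" value="20" />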

Another option, of course, is to add more RAM.
