Guava Cache, how to block access while doing removal


Question



I have thread A inserting a new element into a Guava Cache, and because of the size policy the cache evicts the element associated with key Y.

Unfortunately, the removal process R for Y takes a long time, and while Y is being processed by R (already evicted from the cache, but still inside R), another thread B tries to get the data associated with key Y.

Basically, R will update the database for key Y, and while that value is not yet updated, thread B accesses the database for the value associated with key Y and gets the stale old value.

Question is: how can I block thread B from accessing the element with key Y while R is doing its job?

Solution

You stated Guava Cache, but there is no code example, so I'll give a general answer.

For the below I assume that you have a "loading cache", aka "self-populating cache", pattern.

Solution 1: Properly design your cache interactions and database transactions.

The update process invalidates the cache entry as soon as a transaction is started on it.

  begin transaction
  touch some of the entry data with SQL UPDATE to have it in the transaction
  remove the entry from the cache
  ....
  now you can do more operations on the database regarding the entry data
  if you have the proper isolation level, reads from the database will stall
  until the transaction is committed
  ....
  end transaction

If you remove the entry from the cache and then start the transaction you introduce a race condition.
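The ordering above can be sketched with a small in-memory stand-in. This is only an illustration, not Guava or JDBC API: the names `TxSketch`, `db`, `cache`, and `rowLock` are mine, and a single `ReentrantLock` plays the role of the row lock that a real database holds under a suitable isolation level, so a reader that misses the cache stalls until "commit".

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.locks.ReentrantLock;

// In-memory sketch of "invalidate inside the transaction" (names are mine).
// rowLock stands in for the database row lock held by the open transaction.
class TxSketch {
    final ConcurrentHashMap<String, String> db = new ConcurrentHashMap<>();
    final ConcurrentHashMap<String, String> cache = new ConcurrentHashMap<>();
    private final ReentrantLock rowLock = new ReentrantLock();

    void updateInTx(String key, String newValue) {
        rowLock.lock();                 // begin transaction: the row is now locked
        try {
            db.put(key, newValue);      // "SQL UPDATE": entry data is part of the tx
            cache.remove(key);          // invalidate the entry INSIDE the transaction
            // ... more operations on the entry data ...
        } finally {
            rowLock.unlock();           // end transaction: commit releases the row
        }
    }

    String read(String key) {
        String v = cache.get(key);
        if (v != null) return v;        // cache hit: no database access at all
        rowLock.lock();                 // miss: the DB read stalls until commit
        try {
            v = db.get(key);
            cache.put(key, v);          // repopulate with the committed value
            return v;
        } finally {
            rowLock.unlock();
        }
    }

    public static void main(String[] args) {
        TxSketch s = new TxSketch();
        s.db.put("Y", "old");
        s.cache.put("Y", "old");
        s.updateInTx("Y", "new");
        System.out.println(s.read("Y")); // prints "new"
    }
}
```

In real code you don't write `rowLock` yourself; the isolation level gives you that blocking for free, which is exactly why the invalidation has to happen while the transaction is open.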

Solution 2: Use caches that block out concurrent operations on the same key/entry.

Take a look at the ehcache BlockingCache. Or take a look at cache2k, where the blocking behaviour is the default.

However, you still need to do additional locking at the loader level yourself, e.g. as in the example below.
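If pulling in another cache library is not an option, the same per-key blocking behaviour can be observed with plain JDK `ConcurrentHashMap.computeIfAbsent`, which runs the mapping function at most once per key and blocks other callers for that key while the computation is in progress (this is an illustration only, not the ehcache or cache2k API; the class and field names are mine):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

// Per-key blocking with plain JDK: two threads racing on key "Y" result in
// exactly one loader invocation; the second caller blocks and sees the value.
class BlockingLoadDemo {
    static final ConcurrentHashMap<String, String> cache = new ConcurrentHashMap<>();
    static final AtomicInteger loads = new AtomicInteger();

    static String get(String key) {
        return cache.computeIfAbsent(key, k -> {
            loads.incrementAndGet();                 // count loader invocations
            try { Thread.sleep(100); }               // simulate a slow load
            catch (InterruptedException e) { Thread.currentThread().interrupt(); }
            return "value-for-" + k;
        });
    }

    public static void main(String[] args) {
        Thread a = new Thread(() -> get("Y"));
        Thread b = new Thread(() -> get("Y"));
        a.start(); b.start();
        try { a.join(); b.join(); } catch (InterruptedException e) { }
        System.out.println("loads=" + loads.get()); // prints loads=1
    }
}
```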

Solution 3: Do the locking yourself on top of the cache and wrap all cache operations, e.g. with something like:

 Cache cache;
 Lock[] locks = new Lock[16];
 { /* initialize locks, e.g. each with new ReentrantLock() */ }

 public Object get(Object key) {
   // mask the sign bit: hashCode() may be negative
   int idx = (key.hashCode() & 0x7fffffff) % locks.length;
   locks[idx].lock();
   try { return cache.get(key); }
   finally { locks[idx].unlock(); }
 }

 public void update(Object key, Object obj) {
   int idx = (key.hashCode() & 0x7fffffff) % locks.length;
   locks[idx].lock();
   try { cache.put(key, obj); }   // no return: update is void
   finally { locks[idx].unlock(); }
 }

You can also look at the BlockingCache implementation from ehcache and take some inspiration from there.
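To sanity-check the striping, here is a self-contained, runnable variant of the wrapper (the class and method names are mine, and a `ConcurrentHashMap` stands in for the cache). Several threads do a read-modify-write on the same key; because all of them serialize on that key's stripe, no increment is lost:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.locks.Lock;
import java.util.concurrent.locks.ReentrantLock;

// Self-contained striped-lock wrapper (names are mine). Compound operations
// on one key are made atomic by holding that key's stripe lock.
class StripedCache {
    private final ConcurrentHashMap<Object, Object> map = new ConcurrentHashMap<>();
    private final Lock[] locks = new Lock[16];

    StripedCache() {
        for (int i = 0; i < locks.length; i++) locks[i] = new ReentrantLock();
    }

    private int stripe(Object key) {
        // mask the sign bit: hashCode() may be negative
        return (key.hashCode() & 0x7fffffff) % locks.length;
    }

    public Object get(Object key) {
        Lock l = locks[stripe(key)];
        l.lock();
        try { return map.get(key); } finally { l.unlock(); }
    }

    public void update(Object key, Object value) {
        Lock l = locks[stripe(key)];
        l.lock();
        try { map.put(key, value); } finally { l.unlock(); }
    }

    // Read-modify-write under the key's stripe lock: the lock makes it atomic.
    public void increment(Object key) {
        Lock l = locks[stripe(key)];
        l.lock();
        try {
            Integer v = (Integer) map.get(key);
            map.put(key, v == null ? 1 : v + 1);
        } finally { l.unlock(); }
    }

    public static void main(String[] args) throws InterruptedException {
        StripedCache c = new StripedCache();
        List<Thread> threads = new ArrayList<>();
        for (int t = 0; t < 4; t++) {
            Thread th = new Thread(() -> {
                for (int i = 0; i < 1000; i++) c.increment("Y");
            });
            th.start();
            threads.add(th);
        }
        for (Thread th : threads) th.join();
        System.out.println(c.get("Y")); // prints 4000
    }
}
```

Guava also ships a ready-made lock-striping utility, `com.google.common.util.concurrent.Striped`, which you could use instead of managing the `Lock[]` array by hand.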

Have fun!
