LRU cache in Java with Generics and O(1) operations


Problem description


This is a question that comes up a lot in job interviews. The idea is to define a data structure instead of using Java's built in LinkedHashMap.

An LRU cache deletes the least recently used entry to insert a new one. So, given the following scenario:

 A - B - C - D - E

Where A is the least recently used item, if we were to insert F, we would need to remove A.

This can be easily implemented if we keep a HashMap holding the cache entries as (key, value) pairs and a separate list that records each element's key and time of use. However, we would need to scan the list to find the least recently used item, a potentially O(n) operation.
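As a point of comparison, a minimal sketch of that naive two-structure approach (the class name `NaiveLRUCache` and its fields are illustrative, not part of the question) shows where the O(n) cost comes from:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;

// Naive LRU: a HashMap for the values plus a list of keys in usage order.
// Moving a key to the end requires List.remove(Object), an O(n) scan.
class NaiveLRUCache<K, V> {
    private final HashMap<K, V> entries = new HashMap<>();
    private final List<K> usageOrder = new ArrayList<>(); // index 0 = least recently used
    private final int maxSize;

    NaiveLRUCache(int maxSize) {
        this.maxSize = maxSize;
    }

    public V get(K key) {
        V value = entries.get(key);
        if (value != null) {
            usageOrder.remove(key);   // O(n): must scan the list for the key
            usageOrder.add(key);      // mark as most recently used
        }
        return value;
    }

    public void put(K key, V value) {
        if (entries.containsKey(key)) {
            usageOrder.remove(key);   // O(n) again
        } else if (entries.size() == maxSize) {
            entries.remove(usageOrder.remove(0)); // evict the least recently used
        }
        entries.put(key, value);
        usageOrder.add(key);
    }
}
```

Both `get()` and `put()` are dominated by the linear scan in `List.remove(Object)`, which is exactly the cost the question asks us to eliminate.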

How can this structure be implemented in Java for Generic objects and O(1) operations?

This is different from the possible duplicate in that it focuses on efficiency (O(1) ops) and implementing the data structure itself, not extending Java's.
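For reference, the `LinkedHashMap`-based solution the question explicitly rules out takes only a few lines, since `LinkedHashMap` supports access ordering and an eviction hook out of the box (the class name `LinkedHashMapLRU` is illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LRU cache built on LinkedHashMap: accessOrder=true re-orders entries on
// each get(), and removeEldestEntry() evicts once the map exceeds maxSize.
class LinkedHashMapLRU<K, V> extends LinkedHashMap<K, V> {
    private final int maxSize;

    LinkedHashMapLRU(int maxSize) {
        super(16, 0.75f, true); // true = iterate in access order, not insertion order
        this.maxSize = maxSize;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxSize; // called after each put(); true evicts the eldest entry
    }
}
```

Internally, `LinkedHashMap` uses the same hash-table-plus-doubly-linked-list design the answer below builds by hand.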

Solution

From the question itself, we can see that the problem of O(n) operations arises when querying the linked list. Therefore, we need an alternative data structure. We need to be able to update the items' last access time from the HashMap without searching.

We can keep two separate data structures: a HashMap with (Key, Pointer) pairs and a doubly linked list which will work as the priority queue for deletion and store the Values. From the HashMap, we can point to an element in the doubly linked list and update its retrieval time. Because we go directly from the HashMap to the item in the list, our time complexity remains at O(1).

For example, our doubly linked list can look like:

least_recently_used  -> A <-> B <-> C <-> D <-> E <- most_recently_used

We need to keep a pointer to the LRU and MRU items. The entries' values will be stored in the list, and when we query the HashMap, we will get a pointer to the list node. On get(), we need to move the item to the right-most side of the list. On put(key, value), if the cache is full, we need to remove the item at the left-most side of the list from both the list and the HashMap.

The following is an example implementation in Java:

import java.util.HashMap;

public class LRUCache<K, V>{

    // Define Node with pointers to the previous and next items and a key, value pair
    class Node<T, U> {
        Node<T, U> previous;
        Node<T, U> next;
        T key;
        U value;

        public Node(Node<T, U> previous, Node<T, U> next, T key, U value){
            this.previous = previous;
            this.next = next;
            this.key = key;
            this.value = value;
        }
    }

    private HashMap<K, Node<K, V>> cache;
    private Node<K, V> leastRecentlyUsed;
    private Node<K, V> mostRecentlyUsed;
    private int maxSize;
    private int currentSize;

    public LRUCache(int maxSize){
        this.maxSize = maxSize;
        this.currentSize = 0;
        leastRecentlyUsed = new Node<K, V>(null, null, null, null);
        mostRecentlyUsed = leastRecentlyUsed;
        cache = new HashMap<K, Node<K, V>>();
    }

    public V get(K key){
        Node<K, V> tempNode = cache.get(key);
        if (tempNode == null){
            return null;
        }
        // If it is already the MRU, leave the list as it is. Compare node
        // references, not keys: == on generic keys is reference equality
        // and can wrongly fail (e.g. for non-interned Integer keys)
        else if (tempNode == mostRecentlyUsed){
            return mostRecentlyUsed.value;
        }

        // Get the next and previous nodes
        Node<K, V> nextNode = tempNode.next;
        Node<K, V> previousNode = tempNode.previous;

        // If at the left-most, we update LRU (again comparing nodes, not keys)
        if (tempNode == leastRecentlyUsed){
            nextNode.previous = null;
            leastRecentlyUsed = nextNode;
        }

        // If we are in the middle, we need to update the items before and after our item
        else if (tempNode != mostRecentlyUsed){
            previousNode.next = nextNode;
            nextNode.previous = previousNode;
        }

        // Finally move our item to the MRU
        tempNode.previous = mostRecentlyUsed;
        mostRecentlyUsed.next = tempNode;
        mostRecentlyUsed = tempNode;
        mostRecentlyUsed.next = null;

        return tempNode.value;

    }

    public void put(K key, V value){
        // For simplicity, ignore a put() for a key that is already cached;
        // a complete implementation would update the value and move it to MRU
        if (cache.containsKey(key)){
            return;
        }

        // Put the new node at the right-most end of the linked-list
        Node<K, V> myNode = new Node<K, V>(mostRecentlyUsed, null, key, value);
        mostRecentlyUsed.next = myNode;
        cache.put(key, myNode);
        mostRecentlyUsed = myNode;

        // Delete the left-most entry and update the LRU pointer
        if (currentSize == maxSize){
            cache.remove(leastRecentlyUsed.key);
            leastRecentlyUsed = leastRecentlyUsed.next;
            leastRecentlyUsed.previous = null;
        }

        // Update cache size, for the first added entry update the LRU pointer
        else if (currentSize < maxSize){
            if (currentSize == 0){
                leastRecentlyUsed = myNode;
            }
            currentSize++;
        }
    }
}
