Easy, simple to use LRU cache in Java

You can use a LinkedHashMap (Java 1.4+):

// Create the cache over a LinkedHashMap in access order
final int MAX_ENTRIES = 100;
Map<String, Object> cache = new LinkedHashMap<String, Object>(MAX_ENTRIES + 1, .75F, true) {
    // This method is called just after a new entry has been added
    @Override
    protected boolean removeEldestEntry(Map.Entry<String, Object> eldest) {
        return size() > MAX_ENTRIES;
    }
};

// Add to cache
String key = "key";
cache.put(key, object);

// Get object
Object o = cache.get(key);
if (o == null && !cache.containsKey(key)) {
    // Object not in cache. If null is not a possible value in the cache,
    // the call to cache.containsKey(key) is not needed
}

// If the cache is to be used by multiple threads,
// the cache must be wrapped with code to synchronize the methods
cache = Collections.synchronizedMap(cache);

How would you implement an LRU cache in Java?

If I were doing this again from scratch today, I'd use Guava's CacheBuilder.
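A minimal sketch of that approach (assuming Guava is on the classpath; maximumSize gives size-based eviction that approximates LRU order):

import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

Cache<String, Object> cache = CacheBuilder.newBuilder()
        .maximumSize(100) // evict entries once the cache grows past 100
        .build();

cache.put("key", "value");
Object value = cache.getIfPresent("key"); // null if absent or already evicted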

LRU cache in Java with Generics and O(1) operations

From the question itself, we can see that the problem of O(n) operations arises when querying the linked list. Therefore, we need an alternative data structure. We need to be able to update the items' last access time from the HashMap without searching.

We can keep two separate data structures: a HashMap with (Key, Pointer) pairs and a doubly linked list, which works as the priority queue for deletion and stores the Values. From the HashMap, we can point directly to an element in the doubly linked list and update its retrieval time. Because we go directly from the HashMap to the item in the list, our time complexity remains O(1).

For example, our doubly linked list can look like:

least_recently_used  -> A <-> B <-> C <-> D <-> E <- most_recently_used

We need to keep a pointer to the LRU and MRU items. The entries' values will be stored in the list, and when we query the HashMap, we will get a pointer to the list. On get(), we need to move the item to the right-most side of the list. On put(key, value), if the cache is full, we need to remove the item at the left-most side of the list from both the list and the HashMap.

The following is an example implementation in Java:

import java.util.HashMap;

public class LRUCache<K, V> {

    // A doubly linked list node holding a key/value pair and pointers
    // to the previous and next items
    class Node<T, U> {
        Node<T, U> previous;
        Node<T, U> next;
        T key;
        U value;

        public Node(Node<T, U> previous, Node<T, U> next, T key, U value) {
            this.previous = previous;
            this.next = next;
            this.key = key;
            this.value = value;
        }
    }

    private HashMap<K, Node<K, V>> cache;
    private Node<K, V> leastRecentlyUsed;
    private Node<K, V> mostRecentlyUsed;
    private int maxSize;
    private int currentSize;

    public LRUCache(int maxSize) {
        this.maxSize = maxSize;
        this.currentSize = 0;
        // Placeholder node so the MRU pointer is never null;
        // the first put() links its node after it
        leastRecentlyUsed = new Node<K, V>(null, null, null, null);
        mostRecentlyUsed = leastRecentlyUsed;
        cache = new HashMap<K, Node<K, V>>();
    }

    public V get(K key) {
        Node<K, V> tempNode = cache.get(key);
        if (tempNode == null) {
            return null;
        }
        // If it is already the MRU, leave the list as it is.
        // Note: we compare nodes, not keys; comparing generic keys
        // with == would test reference identity and is a bug.
        else if (tempNode == mostRecentlyUsed) {
            return mostRecentlyUsed.value;
        }

        // Get the next and previous nodes
        Node<K, V> nextNode = tempNode.next;
        Node<K, V> previousNode = tempNode.previous;

        // If at the left-most end, update the LRU pointer
        if (tempNode == leastRecentlyUsed) {
            nextNode.previous = null;
            leastRecentlyUsed = nextNode;
        }
        // If in the middle, relink the items before and after our item
        else {
            previousNode.next = nextNode;
            nextNode.previous = previousNode;
        }

        // Finally move our item to the MRU position
        tempNode.previous = mostRecentlyUsed;
        mostRecentlyUsed.next = tempNode;
        mostRecentlyUsed = tempNode;
        mostRecentlyUsed.next = null;

        return tempNode.value;
    }

    public void put(K key, V value) {
        // This implementation does not update the value or the recency
        // of a key that is already present
        if (cache.containsKey(key)) {
            return;
        }

        // Put the new node at the right-most end of the linked list
        Node<K, V> myNode = new Node<K, V>(mostRecentlyUsed, null, key, value);
        mostRecentlyUsed.next = myNode;
        cache.put(key, myNode);
        mostRecentlyUsed = myNode;

        // Delete the left-most entry and update the LRU pointer
        if (currentSize == maxSize) {
            cache.remove(leastRecentlyUsed.key);
            leastRecentlyUsed = leastRecentlyUsed.next;
            leastRecentlyUsed.previous = null;
        }
        // Update the cache size; for the first added entry update the LRU pointer
        else if (currentSize < maxSize) {
            if (currentSize == 0) {
                leastRecentlyUsed = myNode;
            }
            currentSize++;
        }
    }
}

Best way to implement LRU cache

If you want an LRU cache, the simplest option in Java is LinkedHashMap. The default behaviour is insertion order (FIFO); however, you can change it to "access order", which makes it an LRU cache.

public static <K, V> Map<K, V> lruCache(final int maxSize) {
    return new LinkedHashMap<K, V>(maxSize * 4 / 3, 0.75f, true) {
        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > maxSize;
        }
    };
}

Note: I am using the three-argument constructor, which changes the ordering from insertion order to access order.

From the Javadoc

public LinkedHashMap(int initialCapacity,
                     float loadFactor,
                     boolean accessOrder)
Constructs an empty LinkedHashMap instance with the specified initial capacity, load factor and ordering mode.
Parameters:
initialCapacity - the initial capacity
loadFactor - the load factor
accessOrder - the ordering mode - true for access-order, false for insertion-order

When accessOrder is true, the LinkedHashMap rearranges its internal order whenever you get() an entry that is not already the most recent one.

This way, the eldest entry is the least recently used one.
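For example, using the lruCache method above:

Map<String, Integer> cache = lruCache(2);
cache.put("a", 1);
cache.put("b", 2);
cache.get("a");    // touch "a" so it becomes the most recently used
cache.put("c", 3); // evicts "b", the least recently used entry
System.out.println(cache.keySet()); // prints [a, c]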

Implementing an LRU cache with built in data structures

You can use Java's LinkedHashMap and override removeEldestEntry to implement an LRU cache:

import java.util.LinkedHashMap;
import java.util.Map;

public class SimpleLru<K, V> extends LinkedHashMap<K, V> {

    final int cacheSize;

    public SimpleLru(int cacheSize) {
        // Access order, not the default insertion order; without this
        // the map would evict in FIFO order rather than LRU order
        super(16, 0.75f, true);
        this.cacheSize = cacheSize;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return this.size() > this.cacheSize;
    }

    public static void main(String[] args) {
        SimpleLru<String, Integer> m = new SimpleLru<>(2); // Max 2
        m.put("k1", 1); // k1:1
        m.put("k2", 2); // k1:1, k2:2
        m.put("k3", 3); // k2:2, k3:3 -- the eldest entry k1 is evicted
    }
}

If you want to have a thread-safe version, you can use:

public class ConcurrentLru<K, V> {

    final Object mutex = new Object();
    final Map<K, V> cache;

    public ConcurrentLru(final int cacheSize) {
        // Access order again, so eviction follows least-recently-used
        this.cache = new LinkedHashMap<K, V>(16, 0.75f, true) {

            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return this.size() > cacheSize;
            }
        };
    }

    public void put(K k, V v) {
        synchronized (this.mutex) { this.cache.put(k, v); }
    }

    public boolean contains(K k) {
        synchronized (this.mutex) { return this.cache.containsKey(k); }
    }

    public V remove(K k) {
        synchronized (this.mutex) { return this.cache.remove(k); }
    }

    public V get(K k) {
        synchronized (this.mutex) { return this.cache.get(k); }
    }
}
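Usage is then the same as any map-backed cache, for example:

ConcurrentLru<String, Integer> cache = new ConcurrentLru<>(100);
cache.put("k1", 1);
Integer v = cache.get("k1"); // 1, or null if the entry has been evicted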

Use LinkedHashMap to implement LRU cache

As pointed out by Jeffrey, you are using accessOrder. When you created the LinkedHashMap, the third parameter specifies the ordering mode:

"true for access-order, false for insertion-order"

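For example, a minimal illustration of the three-argument constructor:

// true = access order (LRU behaviour); false = insertion order (FIFO)
Map<String, String> map = new LinkedHashMap<>(16, 0.75f, true);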
For a more detailed implementation of an LRU cache, you can look at
http://www.programcreek.com/2013/03/leetcode-lru-cache-java/

Using FIFO vs. LRU on real examples

The important thing to realize here is that caching policies really do vary case by case.
The same policy that works for Netflix's movie streaming service may not work for YouTube's.

Having said that, I will try to answer and state my assumptions for each case; a small code sketch contrasting the two policies follows the list.

Watching a movie: FIFO caching would be a good choice because the task at hand is a linear event. As you watch a film, you are more likely to rewind, say, 10 seconds than you are to restart the film entirely. Therefore, using a FIFO caching technique would be useful here.

Monthly accrual of interest to savings accounts of all users of the bank: I would use an LRU policy here because some accounts may be accruing interest more consistently than others. A FIFO policy would be blind to the fact that while account X accrued interest this month, account Y has been accruing interest for the past 10 months.

Running a video game with lots of graphic elements: An LRU caching policy would work best here because elements may be used more frequently in certain parts of the game than in others.

Searching a specific value in a table: This one really is too broad to say... Really depends on the usage of the table.

Surfing on a website: I would suggest an LRU policy if you are someone who only visits a few sites regularly. However, if you are someone who bounces around from site to site, a FIFO policy may be helpful.
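As a minimal sketch of the FIFO/LRU difference, here are two bounded maps built on LinkedHashMap (as in the earlier answers), where the ordering mode is the only thing that changes:

import java.util.LinkedHashMap;
import java.util.Map;

public class FifoVsLru {

    // Bounded map: accessOrder = false gives FIFO eviction, true gives LRU
    static Map<String, Integer> boundedMap(final int maxSize, boolean accessOrder) {
        return new LinkedHashMap<String, Integer>(16, 0.75f, accessOrder) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, Integer> eldest) {
                return size() > maxSize;
            }
        };
    }

    public static void main(String[] args) {
        for (boolean accessOrder : new boolean[] { false, true }) {
            Map<String, Integer> m = boundedMap(2, accessOrder);
            m.put("a", 1);
            m.put("b", 2);
            m.get("a");    // a cache hit on "a"; only the LRU map reorders
            m.put("c", 3); // FIFO evicts "a" (oldest insert); LRU evicts "b"
            System.out.println(m.keySet()); // [b, c] for FIFO, [a, c] for LRU
        }
    }
}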

Hope this helps!


