Monday, 12 September 2016

Implement LRU cache in Java

In this blog post we will see how we can create an LRU cache using the Java collections API.

An LRU (Least Recently Used) cache discards the least recently used element when it needs to free up space. This requires keeping track of what was used when, which is expensive if one wants to make sure the cache always discards the least recently used item. General implementations of this technique keep "age bits" for cache entries and track the least recently used entry based on those age bits; in such an implementation, every time an entry is used, the age of every other entry changes.
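
As a quick aside, if thread safety is not a concern, the JDK's LinkedHashMap can already give LRU behaviour out of the box: constructing it in access order and overriding removeEldestEntry yields a simple, non-thread-safe LRU map. Here is a minimal sketch of that approach (the class name is purely illustrative):

import java.util.LinkedHashMap;
import java.util.Map;

public class LinkedHashMapLRUCache<K, V> extends LinkedHashMap<K, V> {

    private final int maxSize;

    public LinkedHashMapLRUCache(final int maxSize) {
        // accessOrder = true orders entries by access instead of insertion
        super(16, 0.75f, true);
        this.maxSize = maxSize;
    }

    @Override
    protected boolean removeEldestEntry(final Map.Entry<K, V> eldest) {
        // evict the eldest (least recently used) entry once the size limit is exceeded
        return size() > maxSize;
    }
}

In this post, however, we will build the cache with concurrent collections.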

We will use a ConcurrentHashMap as the cache storage and a ConcurrentLinkedQueue to keep track of the order in which elements are accessed. Let's see how the code looks - 

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;

public class LRUCache<K, V> {

    private final int maxSize;

    private final ConcurrentHashMap<K, V> map;

    private final ConcurrentLinkedQueue<K> queue;

    public LRUCache(final int maxSize) {
        this.maxSize = maxSize;
        map = new ConcurrentHashMap<K, V>(maxSize);
        queue = new ConcurrentLinkedQueue<K>();
    }

    public void put(final K key, final V value) {
        if (map.containsKey(key)) {
            // remove the key from the FIFO queue; it is re-added below
            // so that it becomes the most recently used entry
            queue.remove(key);
        }

        // evict the least recently used entries while the cache is full
        while (queue.size() >= maxSize) {
            K oldestKey = queue.poll();
            if (null != oldestKey) {
                map.remove(oldestKey);
            }
        }
        queue.add(key);
        map.put(key, value);
    }

    public V get(final K key) {
        if (map.containsKey(key)) {
            // remove the key from the queue and add it again at the tail
            // so that it is marked as the most recently used entry
            queue.remove(key);
            queue.add(key);
        }
        return map.get(key);
    }
}

Let's see the main class that exercises the cache - 

public class ValidateLRUCache {

    public static void main(String[] args) {
        LRUCache<Integer, String> cache = new LRUCache<>(5);
        cache.put(1, "A");
        cache.put(2, "B");
        cache.put(3, "C");
        cache.put(4, "D");
        cache.put(5, "E");
        // key 5 becomes the most recently used; this prints "E"
        System.out.println(cache.get(5));
        // put a new element into the cache; this evicts key 1, the least recently used
        cache.put(6, "F");
        // this will print null because key 1 has been evicted
        System.out.println(cache.get(1));
    }
}

In this example we are using a ConcurrentLinkedQueue to maintain the access order. Its poll method retrieves and removes the head of the queue, or returns null if the queue is empty. So, while putting an element into the cache storage, if we exceed the size limit the least recently used key is polled from the queue and the entry for that key is removed from the cache. Similarly, when we fetch an element from the cache we update its access order by moving its key to the tail of the queue.
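
To illustrate the poll behaviour described above, here is a small standalone snippet (the class name PollDemo is made up purely for illustration):

import java.util.concurrent.ConcurrentLinkedQueue;

public class PollDemo {

    public static void main(String[] args) {
        ConcurrentLinkedQueue<Integer> queue = new ConcurrentLinkedQueue<>();
        queue.add(1);
        queue.add(2);
        queue.add(3);

        // poll() removes and returns the head of the queue, i.e. the oldest element
        System.out.println(queue.poll()); // prints 1
        System.out.println(queue.poll()); // prints 2
        System.out.println(queue.poll()); // prints 3
        // once the queue is empty, poll() returns null instead of throwing
        System.out.println(queue.poll()); // prints null
    }
}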
