Tuesday 13 September 2016

Java in-memory cache with time- and size-based eviction

In this blog post we will see how to create an in-memory cache with both time-based and size-based eviction. Time-based (time-to-live) eviction removes cached objects whose last access time is older than a predefined threshold, whereas size-based eviction removes the least recently used (LRU) objects once the cache reaches its capacity.

Key for cache: 

public class CacheKey<T> {

    private T key;

    public CacheKey(T key) {
        this.key = key;
    }

    public T getKey() {
        return key;
    }

    public void setKey(T key) {
        this.key = key;
    }

    // equals and hashCode delegate to the wrapped key so that two
    // CacheKey instances wrapping equal values hit the same map entry
    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (o == null || getClass() != o.getClass()) return false;
        CacheKey<?> other = (CacheKey<?>) o;
        return key == null ? other.key == null : key.equals(other.key);
    }

    @Override
    public int hashCode() {
        return key == null ? 0 : key.hashCode();
    }
}

Object as value for cache:

public class CachedObject<T> {

    private long lastAccessedTime;

    private T value;

    public CachedObject(T value) {
        this.lastAccessedTime = System.currentTimeMillis();
        this.value = value;
    }

    public long getLastAccessedTime() {
        return lastAccessedTime;
    }

    public void setLastAccessedTime(long lastAccessedTime) {
        this.lastAccessedTime = lastAccessedTime;
    }

    public T getValue() {
        return value;
    }

    public void setValue(T value) {
        this.value = value;
    }
}

LRU-based cache storage:


import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;

public class LRUCache<K, V> {

    private final int maxSize;

    private ConcurrentHashMap<K, V> map;

    private ConcurrentLinkedQueue<K> queue;

    public LRUCache(final int maxSize) {
        this.maxSize = maxSize;
        map = new ConcurrentHashMap<K, V>(maxSize);
        queue = new ConcurrentLinkedQueue<K>();
    }

    public void put(final K key, final V value) {
        if (map.containsKey(key)) {
            // remove the key so it can be re-added at the tail of the queue
            queue.remove(key);
        }

        while (queue.size() >= maxSize) {
            // the head of the queue is the least recently used key
            K oldestKey = queue.poll();
            if (null != oldestKey) {
                map.remove(oldestKey);
            }
        }
        queue.add(key);
        map.put(key, value);
    }

    public V get(final K key) {
        if (map.containsKey(key)) {
            // move the key to the tail of the queue to mark it as recently used
            queue.remove(key);
            queue.add(key);
        }
        return map.get(key);
    }

    public void remove(final K key) {
        if (map.containsKey(key)) {
            queue.remove(key);
            map.remove(key);
        }
    }

    public ConcurrentHashMap<K, V> getMap() {
        return map;
    }

    public void setMap(ConcurrentHashMap<K, V> map) {
        this.map = map;
    }

    public ConcurrentLinkedQueue<K> getQueue() {
        return queue;
    }

    public void setQueue(ConcurrentLinkedQueue<K> queue) {
        this.queue = queue;
    }

    public int getMaxSize() {
        return maxSize;
    }
}

In-memory cache implementation:

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class InMemoryCache<K, V> {

    private long timeToLive;

    private LRUCache<CacheKey<K>, CachedObject<V>> cache;

    public InMemoryCache(final long lifeTime, final long timerInterval, final int maxItems) {

        this.timeToLive = lifeTime * 1000;

        cache = new LRUCache<CacheKey<K>, CachedObject<V>>(maxItems);

        if (timeToLive > 0 && timerInterval > 0) {

            Thread t = new Thread(new Runnable() {
                public void run() {
                    while (true) {
                        try {
                            // wake up at the configured cleanup interval, not the TTL
                            Thread.sleep(timerInterval * 1000);
                        } catch (InterruptedException ex) {
                            Thread.currentThread().interrupt();
                            return;
                        }
                        cleanup();
                    }
                }

                private void cleanup() {

                    long now = System.currentTimeMillis();
                    List<CacheKey<K>> deleteKey = new ArrayList<CacheKey<K>>();

                    // collect keys whose idle time has exceeded the TTL
                    for (Map.Entry<CacheKey<K>, CachedObject<V>> entry : cache.getMap().entrySet()) {
                        CachedObject<V> cObject = entry.getValue();
                        if (cObject != null && (now > (timeToLive + cObject.getLastAccessedTime()))) {
                            deleteKey.add(entry.getKey());
                        }
                    }

                    for (CacheKey<K> cacheKey : deleteKey) {
                        cache.remove(cacheKey);
                        // give other threads a chance to run between removals
                        Thread.yield();
                    }
                }
            });

            t.setDaemon(true);
            t.start();
        }
    }

    public void put(CacheKey<K> key, CachedObject<V> value) {
        cache.put(key, value);
    }

    public CachedObject<V> get(CacheKey<K> key) {

        CachedObject<V> c = cache.get(key);

        if (c == null) {
            return null;
        } else {
            // refresh the last access time so the entry stays alive
            c.setLastAccessedTime(System.currentTimeMillis());
            return c;
        }
    }

    public void remove(CacheKey<K> key) {
        cache.remove(key);
    }

    public int size() {
        return cache.getMap().size();
    }
}

So, in the LRUCache storage implementation, once the predefined capacity is reached, the least recently used entry is evicted: the concurrent linked queue maintains the keys in access order, so the head of the queue always identifies the entry to remove.

InMemoryCache runs a daemon thread in the background that periodically removes objects whose idle time has exceeded the configured time to live.
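To make the behaviour concrete, here is a minimal usage sketch. The class and method names come from the code above; the TTL, cleanup interval, capacity, and the "user:1"/"Alice" values are illustrative. Note that looking up with a fresh CacheKey instance relies on the equals and hashCode methods defined on CacheKey.

public class ValidateInMemoryCache {

    public static void main(String[] args) throws InterruptedException {
        // TTL of 2 seconds, cleanup every 1 second, at most 100 entries
        InMemoryCache<String, String> cache = new InMemoryCache<>(2, 1, 100);

        cache.put(new CacheKey<>("user:1"), new CachedObject<>("Alice"));
        System.out.println(cache.get(new CacheKey<>("user:1")).getValue()); // Alice

        // wait long enough for the cleanup thread to evict the idle entry
        Thread.sleep(4000);
        System.out.println(cache.get(new CacheKey<>("user:1"))); // null
        System.out.println(cache.size()); // 0
    }
}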

Monday 12 September 2016

Implement an LRU cache in Java

In this blog post we will see how to create an LRU cache using the Java collections API.

An LRU (Least Recently Used) cache discards the least recently used element when it needs to free up space. The algorithm requires keeping track of what was used when, which is expensive if we want to guarantee that the item discarded is always the least recently used one. General implementations keep "age bits" per entry and pick the least recently used entry based on them; in such an implementation, every access effectively changes the relative age of all other entries.
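As a point of comparison before rolling our own: in a single-threaded setting, the JDK's LinkedHashMap can already provide LRU semantics when constructed with accessOrder = true. A minimal sketch (the class name SimpleLRUCache is just for illustration):

import java.util.LinkedHashMap;
import java.util.Map;

// LRU cache backed by LinkedHashMap's access-order mode.
// Not thread-safe; wrap with Collections.synchronizedMap if needed.
public class SimpleLRUCache<K, V> extends LinkedHashMap<K, V> {

    private final int maxSize;

    public SimpleLRUCache(int maxSize) {
        // initial capacity, default load factor, accessOrder = true
        super(16, 0.75f, true);
        this.maxSize = maxSize;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // evict the least recently accessed entry once we exceed maxSize
        return size() > maxSize;
    }
}

The version below instead builds the same behaviour explicitly on top of concurrent collections.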

We will use a ConcurrentHashMap as the cache storage and a ConcurrentLinkedQueue to keep track of the order in which elements are accessed. Let's see what the code looks like:

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentLinkedQueue;

public class LRUCache<K, V> {

    private final int maxSize;

    private ConcurrentHashMap<K, V> map;

    private ConcurrentLinkedQueue<K> queue;

    public LRUCache(final int maxSize) {
        this.maxSize = maxSize;
        map = new ConcurrentHashMap<K, V>(maxSize);
        queue = new ConcurrentLinkedQueue<K>();
    }

    public void put(final K key, final V value) {
        if (map.containsKey(key)) {
            // remove the key so it can be re-added at the tail of the queue
            queue.remove(key);
        }

        while (queue.size() >= maxSize) {
            // the head of the queue is the least recently used key
            K oldestKey = queue.poll();
            if (null != oldestKey) {
                map.remove(oldestKey);
            }
        }
        queue.add(key);
        map.put(key, value);
    }

    public V get(final K key) {
        if (map.containsKey(key)) {
            // move the key to the tail of the queue to mark it as recently used
            queue.remove(key);
            queue.add(key);
        }
        return map.get(key);
    }
}

Let's see the main class:

public class ValidateLRUCache {

    public static void main(String[] args) {
        LRUCache<Integer, String> cache = new LRUCache<>(5);
        cache.put(1, "A");
        cache.put(2, "B");
        cache.put(3, "C");
        cache.put(4, "D");
        cache.put(5, "E");
        // accessing key 5 moves it to the tail of the access queue
        System.out.println(cache.get(5));
        // putting a new element into the full cache evicts key 1
        cache.put(6, "F");
        // this will print null
        System.out.println(cache.get(1));
    }
}

In this example we use the ConcurrentLinkedQueue to maintain the access order. Its poll method retrieves and removes the head of the queue, or returns null if the queue is empty. So when putting an element would exceed the size limit, the least recently used key is polled from the head of the queue and its entry is removed from the map. Similarly, when we fetch an element from the cache, we refresh its access order by moving its key to the tail of the queue. Note that although both collections are thread-safe individually, the check-then-act sequences in put and get are not atomic, and queue.remove(key) is an O(n) scan, so this implementation trades strict LRU guarantees and performance under heavy concurrency for simplicity.
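To see the recency update matter, touch a key just before the cache fills up; which entry gets evicted changes accordingly. A small sketch (inside a main method, using the same LRUCache as above; the capacity of 3 is chosen for illustration):

LRUCache<Integer, String> cache = new LRUCache<>(3);
cache.put(1, "A");
cache.put(2, "B");
cache.put(3, "C");
// touching key 1 moves it to the tail of the access queue
cache.get(1);
// the cache is full, so this evicts the least recently used key, which is now 2
cache.put(4, "D");
System.out.println(cache.get(2)); // prints null
System.out.println(cache.get(1)); // prints A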