
I have developed an LRU cache in Java, and now I need to adapt it for a multi-threaded environment. What changes do I need to make to the program below so that it is safe to use from multiple threads? Here is my code:

import java.util.LinkedHashMap;

public class LRUCache extends LinkedHashMap<Integer, String> {
    private static final long serialVersionUID = 1342L;

    private int cacheSize;

    // Constructor arguments: initialCapacity, loadFactor, accessOrder -
    // the ordering mode is true for access-order, false for insertion-order.
    public LRUCache(int size) {
        super(size, 0.75f, true);
        this.cacheSize = size;
    }

    // removeEldestEntry() must be overridden, otherwise the map never removes
    // the oldest entry. Here we remove it only once cacheSize is exceeded.
    @Override
    protected boolean removeEldestEntry(java.util.Map.Entry<Integer, String> eldest) {
        return size() > cacheSize;
    }

    public static void main(String[] args) {
        LRUCache lruCache = new LRUCache(2);
        lruCache.put(1, "Object1");
        lruCache.put(2, "Object2");
        lruCache.put(3, "Object3");
        System.out.println(lruCache);
    }
}

1 Answer


A very good cache implementation (with LRU-style, size-bounded eviction) already exists in Google Guava's caches.
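
For example, assuming the Guava library is on the classpath, a size-bounded, thread-safe cache takes only a few lines (the class name and the limit of 2 just mirror the question's example):

import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class GuavaCacheExample {
    public static void main(String[] args) {
        // Size-bounded cache; entries are evicted (approximately in LRU order)
        // once the maximum size is exceeded.
        Cache<Integer, String> cache = CacheBuilder.newBuilder()
                .maximumSize(2)
                .build();
        cache.put(1, "Object1");
        cache.put(2, "Object2");
        cache.put(3, "Object3");
        System.out.println(cache.asMap());   // at most two entries should remain
    }
}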

The first problem with your implementation is that LinkedHashMap is NOT thread-safe.

The simplest fix is to wrap it with java.util.Collections.synchronizedMap(yourMap), although that is not the most efficient option.
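
For example, a minimal wrapper around the LRUCache class from the question could look like the sketch below (the demo class name is just for illustration). Note that iterating over a synchronized map still requires an explicit synchronized block:

import java.util.Collections;
import java.util.Map;

public class SynchronizedLRUCacheDemo {
    public static void main(String[] args) {
        // Wrap the LRUCache from the question; every call now locks the whole map.
        Map<Integer, String> safeCache = Collections.synchronizedMap(new LRUCache(2));
        safeCache.put(1, "Object1");
        safeCache.put(2, "Object2");
        safeCache.get(1);

        // Iteration must still be synchronized manually on the wrapper.
        synchronized (safeCache) {
            for (Map.Entry<Integer, String> e : safeCache.entrySet()) {
                System.out.println(e.getKey() + " -> " + e.getValue());
            }
        }
    }
}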

A second thing to check is the ordering mode. Because your constructor calls super(size, 0.75f, true), the map is access-ordered, so get() already moves the accessed key to the most-recently-used position and the eviction really is LRU. If that third argument were false (or the default constructor were used), the map would keep insertion order and behave as a FIFO cache instead.

For example if you do:

LRUCache lruCache = new LRUCache(2);
lruCache.put(1, "Object1");
lruCache.put(2, "Object2");
lruCache.get(1);              // Here value 1 is "used"
lruCache.put(3, "Object3");
System.out.println(lruCache);

With access order (your code), key 2 is evicted when 3 is added. With insertion order, key 1 would be evicted even though it was just used, which is First In First Out (FIFO), not LRU. Note also that in an access-ordered LinkedHashMap even get() structurally modifies the map, so in a multi-threaded environment reads need the same synchronization as writes.

I suggest creating a new class (or, better, an interface plus an implementation) for the LRU cache that does not extend a Map, and that uses a ConcurrentHashMap internally to store and look up the values by key. Alongside each original value, store some additional data that records when the entry was last used, and update that data on every get(); the least recently used entry can then be found and evicted when the cache is full.
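
A minimal sketch of that idea (the class and field names here are illustrative, not from the answer): each value is wrapped together with a last-access counter in a ConcurrentHashMap, and when the size limit is exceeded the entry with the oldest counter is scanned for and removed. The size check and the eviction are not atomic, so treat this as a starting point rather than a finished cache:

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicLong;

public class SimpleConcurrentLRUCache<K, V> {

    // Value wrapper: the original value plus a recency marker.
    private static final class Entry<V> {
        final V value;
        volatile long lastAccess;
        Entry(V value, long tick) { this.value = value; this.lastAccess = tick; }
    }

    private final int maxSize;
    private final AtomicLong ticker = new AtomicLong();
    private final Map<K, Entry<V>> map = new ConcurrentHashMap<>();

    public SimpleConcurrentLRUCache(int maxSize) { this.maxSize = maxSize; }

    public V get(K key) {
        Entry<V> e = map.get(key);
        if (e == null) {
            return null;
        }
        e.lastAccess = ticker.incrementAndGet();   // record the access
        return e.value;
    }

    public void put(K key, V value) {
        map.put(key, new Entry<>(value, ticker.incrementAndGet()));
        if (map.size() > maxSize) {
            evictOldest();
        }
    }

    // O(n) scan for the entry with the smallest lastAccess value.
    private void evictOldest() {
        K oldestKey = null;
        long oldestTick = Long.MAX_VALUE;
        for (Map.Entry<K, Entry<V>> e : map.entrySet()) {
            if (e.getValue().lastAccess < oldestTick) {
                oldestTick = e.getValue().lastAccess;
                oldestKey = e.getKey();
            }
        }
        if (oldestKey != null) {
            map.remove(oldestKey);
        }
    }
}

The O(n) eviction scan keeps the sketch short; the Guava cache mentioned above avoids that cost and handles the concurrency corner cases for you.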

Similar to your problem:
