
LRU cache in Java with Generics and O(1) operations

Problem: Define a fixed-size LRU cache that supports amortized O(1) operations on objects of type E. Each object has a unique id. This question contains a problem description and several solutions.

Possible solution that does not work: create a single LinkedList and a single HashMap<E.id, Node> that maps each element's id to the address of its node in the linked list.

Deleting an entry from the cache can be done by:

node = hashmap.get(elem.id);
node.prev.next = node.next;  // unlink the node from its predecessor
node.next.prev = node.prev;  // ...and from its successor
hashmap.remove(elem.id);

(ignoring corner cases here for brevity)

The problem here is that Java's LinkedList class does not seem to expose individual nodes in the list. So I can't store their addresses in the hashmap. Is there a workaround for this? Most existing solutions implement their own DoublyLinkedList class. Can I use/extend an existing class in Java Collections Framework instead?

    You probably won't be able to do this without _building your own_ linked list and possibly a hash map too. Or you could use the `LinkedHashMap` [constructor](https://stackoverflow.com/q/27475797/869736) specifically for this. – Louis Wasserman Jan 16 '21 at 04:21

2 Answers


For an LRU cache, the simple solution is just to use a LinkedHashMap and create it with accessOrder set to true. That tells the map to order the map entries by last access time rather than insertion time. The javadoc for the class explains in more detail what this does.

If you want to implement automatic cache removal, extend LinkedHashMap and override the removeEldestEntry to implement your preferred removal policy. The javadoc for the method has an example that limits the cache size to a fixed number of entries.

You didn't ask, but the complexity of an LRU cache implemented this way is the same as for a LinkedHashMap generally: amortized O(1) for entry insertion, lookup, and removal.
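A minimal sketch of this approach (the class name, capacity field, and initial-capacity/load-factor arguments are mine, not from the answer):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// LRU cache built on LinkedHashMap with accessOrder = true:
// iteration order becomes least-recently-accessed first, so the
// "eldest" entry is the one to evict.
class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;  // assumed capacity limit

    LruCache(int maxEntries) {
        // 16 = initial capacity, 0.75f = load factor, true = access order
        super(16, 0.75f, true);
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Called by put(); returning true evicts the least recently
        // used entry once the cap is exceeded.
        return size() > maxEntries;
    }
}
```

For example, in a capacity-2 cache holding "a" and "b", calling get("a") and then put("c", ...) evicts "b", because "b" is now the least recently used entry.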

Stephen C
  • Let's say that every element in the cache has a fixed priority, which is stored as an attribute. I want to remove the element with the lowest priority when the cache is full. How can I do that by overriding protected boolean removeEldestEntry(Map.Entry eldest), which takes only one parameter? Does this change the expected runtime complexity of the solution? – Apoorv Gupta Jan 16 '21 at 17:08
  • Well that is a **different** problem. In a LRU cache, you remove the least recently used cache entry. If you are removing elements according to priority, then it is not an LRU cache. LinkedHashMap only works for LRU and "oldest" by insertion time caches. (But, yes, it changes the expected runtime complexity of any solution.) – Stephen C Jan 17 '21 at 01:45
  • I agree. Thanks. – Apoorv Gupta Jan 17 '21 at 06:32
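The priority-based eviction discussed in the comments above can be sketched with a heap instead of a LinkedHashMap; all class and method names here are illustrative, the sketch assumes distinct ids, and evicting the lowest-priority entry costs O(log n) rather than O(1):

```java
import java.util.Comparator;
import java.util.HashMap;
import java.util.PriorityQueue;

// Hypothetical priority cache (not an LRU cache): a min-heap ordered
// by priority gives O(log n) insert and O(log n) evict-lowest.
class PriorityCache<E> {
    // Pairs an id with its fixed priority and value.
    record Entry<T>(String id, int priority, T value) {}

    private final int capacity;
    private final HashMap<String, Entry<E>> byId = new HashMap<>();
    private final PriorityQueue<Entry<E>> heap =
        new PriorityQueue<>(Comparator.comparingInt((Entry<E> e) -> e.priority()));

    PriorityCache(int capacity) { this.capacity = capacity; }

    void add(String id, int priority, E value) {
        if (byId.size() >= capacity) {
            Entry<E> evicted = heap.poll();   // O(log n): lowest priority out
            byId.remove(evicted.id());
        }
        Entry<E> e = new Entry<>(id, priority, value);
        byId.put(id, e);
        heap.add(e);                          // O(log n)
    }

    E get(String id) {
        Entry<E> e = byId.get(id);
        return e == null ? null : e.value();
    }
}
```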

Here's a solution that implements only the get() method of an LRU cache in amortized O(1) time; add(E) and removeOldest() can be implemented in O(1) time in the same style. The drawback is that each get() leaves an empty node behind in the linked list, which wastes memory.

class LRUCache<E> {
  // Maps each id to the one-element list wrapping its value.
  HashMap<String, List<E>> addresses = new HashMap<>();
  // Most recently used entries sit at the tail of this list.
  LinkedList<List<E>> list = new LinkedList<>();

  public E get(String id) {
    if (!addresses.containsKey(id)) return null;
    List<E> valList = addresses.get(id);
    E val = valList.get(0);
    valList.clear();  // the old node stays in `list` as an empty placeholder
    List<E> newList = new ArrayList<>();  // must be mutable: List.of() would make the next clear() throw
    newList.add(val);
    list.add(newList);  // O(1): append at the tail
    addresses.put(id, newList);
    return val;
  }
}
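For completeness, a hypothetical add() in the same style makes the sketch runnable; the method name and demo class name are assumptions, not part of the answer:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.LinkedList;
import java.util.List;

// Hypothetical completion of the sketch above: add(id, value) appends
// at the tail in O(1), and get(id) re-inserts the value at the tail,
// abandoning its old node as an empty placeholder.
class LRUCacheDemo<E> {
  private final HashMap<String, List<E>> addresses = new HashMap<>();
  private final LinkedList<List<E>> list = new LinkedList<>();

  public void add(String id, E value) {
    List<E> wrapper = new ArrayList<>();
    wrapper.add(value);
    list.add(wrapper);          // O(1): tail of `list` = most recent
    addresses.put(id, wrapper);
  }

  public E get(String id) {
    List<E> valList = addresses.get(id);
    if (valList == null || valList.isEmpty()) return null;
    E val = valList.get(0);
    valList.clear();            // abandon the old node; it stays empty
    add(id, val);               // re-insert at the tail as most recent
    return val;
  }
}
```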