
I am working on a web-based medical application and need to create a small in-memory object cache. Here is my use-case.

We need to show a list of requests submitted by people who need certain things (Blood, Kidney, etc.). It is not going to be a huge list, since on any given day the number of requests for blood or anything else will be limited. Please take into account that we do not want to use any caching API, as that would be overkill. The idea is to create a Map and place it in the ApplicationContext.

The moment a new request is placed by any person, we will update that Map in the application context, and the moment a request expires, we will remove it from the Map. Additionally, we need to look into the following points:

  1. Need to set a max element limit.
  2. If the max limit is reached, we should remove the entry that was added first.
  3. Take care of any synchronization issues.

Please suggest which data structure should be used and what to take care of while implementing this.

Umesh Awasthi

4 Answers


How about Google's Guava cache? It is pretty simple to set up and very easy to use:

Guava Cache
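For illustration, a minimal sketch of what this might look like, assuming Guava is on the classpath (the class names, key format, and sizes below are made up for the example). Note that Guava's size-based eviction is approximately LRU rather than strictly first-in-first-out, so it does not match point 2 of the question exactly:

```java
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;
import java.util.concurrent.TimeUnit;

public class RequestCacheDemo {
    public static void main(String[] args) {
        // Bounded, expiring, thread-safe cache in a few lines.
        Cache<String, String> cache = CacheBuilder.newBuilder()
                .maximumSize(100)                    // point 1: max element limit
                .expireAfterWrite(1, TimeUnit.HOURS) // expired requests vanish on their own
                .build();

        cache.put("REQ-1", "Blood");
        System.out.println(cache.getIfPresent("REQ-1")); // prints "Blood"
    }
}
```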

Eugene
  • I know about it and it's really good, but is it OK to add one more dependency for such a small requirement? – Umesh Awasthi Aug 11 '12 at 07:48
  • @UmeshAwasthi I have no idea in your case, but the thing is that if you add Guava and look at its capabilities, you will use a lot more than just caching. It is a fantastic library; I am pretty sure you know that already. – Eugene Aug 11 '12 at 07:51
1 (answer score)

For the OmniFaces project we had a similar requirement, and there we settled on ConcurrentLinkedHashMap, with a small wrapper that tracks how long cached items are valid and purges them lazily from the cache whenever an item is requested.
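This is not the OmniFaces code itself, but the lazy-purge idea can be sketched with just the JDK: each value carries its expiry time, and `get()` discards stale entries on access (the class and field names here are invented for the sketch):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

class ExpiringCache<K, V> {
    private static final class Entry<V> {
        final V value;
        final long expiresAtMillis;
        Entry(V value, long expiresAtMillis) {
            this.value = value;
            this.expiresAtMillis = expiresAtMillis;
        }
    }

    private final ConcurrentMap<K, Entry<V>> map = new ConcurrentHashMap<>();
    private final long ttlMillis;

    ExpiringCache(long ttlMillis) {
        this.ttlMillis = ttlMillis;
    }

    void put(K key, V value) {
        map.put(key, new Entry<>(value, System.currentTimeMillis() + ttlMillis));
    }

    V get(K key) {
        Entry<V> e = map.get(key);
        if (e == null) {
            return null;
        }
        if (System.currentTimeMillis() > e.expiresAtMillis) {
            map.remove(key, e); // purge lazily, only when the stale entry is touched
            return null;
        }
        return e.value;
    }
}

public class ExpiringCacheDemo {
    public static void main(String[] args) throws InterruptedException {
        ExpiringCache<String, String> cache = new ExpiringCache<>(50);
        cache.put("REQ-1", "Kidney");
        System.out.println(cache.get("REQ-1")); // prints "Kidney"
        Thread.sleep(100);
        System.out.println(cache.get("REQ-1")); // prints "null" (expired, purged)
    }
}
```

Note that this sketch only handles expiry, not the max-element limit; ConcurrentLinkedHashMap adds the bounded-size part.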

Also see: How would you implement an LRU cache in Java?

Arjan Tijms
1 (answer score)

If you don't want to add third-party libs, you could implement one yourself on top of a LinkedHashMap. LinkedHashMap is well suited for use as a cache, and you can configure its eviction policy (remove the least recently used entry, or the oldest one). Configured accordingly, it will remove the oldest entry, as you need.
And for thread-safety you can always wrap it with Collections#synchronizedMap().

Here is a small example
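A minimal sketch of what such an example might look like (the helper and method names are made up here). The three-argument LinkedHashMap constructor selects the policy: accessOrder=true gives least-recently-used eviction, false (or the default constructor) gives oldest-entry eviction:

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class LruCacheDemo {
    static <K, V> Map<K, V> boundedCache(final int maxEntries) {
        // accessOrder=true: iteration order is least-recently-accessed first,
        // so removeEldestEntry evicts the least recently used entry.
        LinkedHashMap<K, V> lru = new LinkedHashMap<K, V>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > maxEntries;
            }
        };
        // Wrap for thread-safety, as suggested above.
        return Collections.synchronizedMap(lru);
    }

    public static void main(String[] args) {
        Map<String, String> cache = boundedCache(2);
        cache.put("a", "Blood");
        cache.put("b", "Kidney");
        cache.get("a");           // touch "a", so "b" becomes the eldest entry
        cache.put("c", "Plasma"); // exceeds the limit: "b" is evicted
        System.out.println(cache.keySet()); // prints "[a, c]"
    }
}
```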

Cratylus
0 (answer score)

I believe LinkedHashMap is exactly what you need. You just need to override the removeEldestEntry(...) method, and it will automatically remove old entries for you once the maximum capacity is reached. Something like:

import java.util.*;

class CacheMap<K,V> extends LinkedHashMap<K,V> {
    protected final int maxCapacity;

    public CacheMap(int maxCapacity) {
        this.maxCapacity = maxCapacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K,V> eldest) {
        // Called by put(); returning true evicts the oldest (first-inserted) entry.
        return size() > maxCapacity;
    }
}

You could implement more sophisticated logic, for example removing very old entries even if the max capacity has not been reached.

If synchronizing individual map operations is enough for you, you can just wrap the map with Collections.synchronizedMap(...):

Map<K,V> map = Collections.synchronizedMap(new CacheMap<K,V>(capacity));

If you need coarser-grained synchronization, for example reading the map and updating it in one synchronized block, you need to synchronize all code blocks that work with the map yourself.
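For example, a check-then-act sequence like "put only if absent" is two separate atomic operations, so on a synchronizedMap another thread could interleave between them unless you lock the map explicitly (the key and value here are just illustrative):

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class CompoundOpDemo {
    public static void main(String[] args) {
        Map<String, Integer> map = Collections.synchronizedMap(new HashMap<String, Integer>());

        // Lock on the wrapper itself, as the Collections.synchronizedMap
        // javadoc requires, so the compound operation is atomic.
        synchronized (map) {
            if (!map.containsKey("Blood")) {
                map.put("Blood", 1);
            }
        }
        System.out.println(map.get("Blood")); // prints "1"
    }
}
```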

Petr