In terms of concurrency your solution will work fine: ConcurrentHashMap is thread-safe, so it can be shared by many threads.
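For reference, a minimal sketch of a ConcurrentHashMap-based cache (the names here are illustrative, and expensiveLookup is a hypothetical stand-in for your real loader) could use computeIfAbsent, which computes the value at most once per key even under concurrent access:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

public class CacheDemo {

    private static final ConcurrentMap<String, Integer> CACHE = new ConcurrentHashMap<>();

    public static void main(String[] args) {
        // computeIfAbsent runs the mapping function atomically for a given key,
        // so concurrent callers won't duplicate the expensive work.
        int value = CACHE.computeIfAbsent("answer", CacheDemo::expensiveLookup);
        System.out.println(value);
    }

    // Hypothetical placeholder for a costly computation or lookup
    private static int expensiveLookup(String key) {
        return 42;
    }
}
```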
I just recommend being careful about memory leaks: the cache can grow without bound if you don't define a procedure to evict old entries.
You can also define your own cache. For instance, I like to use LRU (Least Recently Used) eviction for my cache objects. With an LRU policy you can be sure the cache won't grow without bound, and the entries removed will be the ones that were accessed longest ago.
You can implement an LRU cache on top of LinkedHashMap; for example, a possible implementation could be:
import java.util.LinkedHashMap;
import java.util.Map;

public class MyLRUHashMap<K, V> extends LinkedHashMap<K, V> {

    private static final long serialVersionUID = -3216599441788189122L;

    private final int maxCapacity;

    public MyLRUHashMap(int capacity) {
        // accessOrder = true so the map orders entries by access, not insertion;
        // without it, eviction would be FIFO rather than least-recently-used
        this(capacity, true);
    }

    public MyLRUHashMap(int capacity, boolean accessOrder) {
        super(capacity, 0.75f, accessOrder);
        this.maxCapacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Called by LinkedHashMap after each insertion; returning true evicts
        // the eldest entry. Use ">" (not ">=") so the cache actually holds
        // maxCapacity entries instead of maxCapacity - 1.
        return this.size() > maxCapacity;
    }
}
In order to create a thread-safe instance of "MyLRUHashMap" you have to wrap it with Collections.synchronizedMap:
Map<Object, Object> myCacheMap = Collections.synchronizedMap(new MyLRUHashMap<>(capacity, true));
Note that the accessOrder argument must be true, otherwise the map evicts in insertion order (FIFO) instead of least-recently-used order.
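To see the eviction behave as described, here is a small self-contained demo (the class name LruDemo and the tiny capacity of 3 are just for illustration); it uses an anonymous LinkedHashMap subclass equivalent to the implementation above:

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class LruDemo {

    public static void main(String[] args) {
        final int maxCapacity = 3; // tiny limit so eviction is easy to observe

        Map<String, Integer> cache = Collections.synchronizedMap(
            new LinkedHashMap<String, Integer>(maxCapacity, 0.75f, true) {
                @Override
                protected boolean removeEldestEntry(Map.Entry<String, Integer> eldest) {
                    return size() > maxCapacity;
                }
            });

        cache.put("a", 1);
        cache.put("b", 2);
        cache.put("c", 3);
        cache.get("a");    // touch "a" so it becomes the most recently used
        cache.put("d", 4); // exceeds capacity: evicts "b", the least recently used

        System.out.println(cache.keySet()); // [c, a, d]
    }
}
```

One caveat with Collections.synchronizedMap: individual calls like put and get are synchronized, but iterating over the map still requires you to synchronize on the map object manually.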
Hope it helps! :)