I'm using Guava's MapMaker to implement caching of data objects in my application:
import java.util.concurrent.ConcurrentMap;

import com.google.common.base.Function;
import com.google.common.collect.MapMaker;

public class DataObjectCache<DO extends MyDataObject> {

    private final ConcurrentMap<String, DO> innerCache;

    public DataObjectCache(Class<DO> doClass) {
        Function<String, DO> loadFunction = new Function<String, DO>() {
            @Override
            public DO apply(String id) {
                //load and return DO instance
            }
        };

        // Computing map: get(id) runs loadFunction atomically for missing keys,
        // and soft values let the GC reclaim entries under memory pressure.
        innerCache = new MapMaker()
                .softValues()
                .makeComputingMap(loadFunction);
    }

    private DO getDataObject(String id) {
        return innerCache.get(id);
    }

    private void putDataObject(DO dataObject) {
        innerCache.putIfAbsent(dataObject.getID(), dataObject);
    }
}
One of these DataObjectCaches would be instantiated for each data object class, and they would be kept in a master Map, using the Class objects as keys.
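To make that concrete, the master map I have in mind looks roughly like this (the registry class and method names here are just illustrative, not existing code):

import java.util.HashMap;
import java.util.Map;

public class DataObjectCacheRegistry {

    // Populated once at startup, so I'm assuming a plain HashMap is sufficient here.
    private final Map<Class<? extends MyDataObject>, DataObjectCache<?>> caches =
            new HashMap<Class<? extends MyDataObject>, DataObjectCache<?>>();

    public <DO extends MyDataObject> void register(Class<DO> doClass) {
        caches.put(doClass, new DataObjectCache<DO>(doClass));
    }

    @SuppressWarnings("unchecked") // each cache is stored under its own class key
    public <DO extends MyDataObject> DataObjectCache<DO> getCache(Class<DO> doClass) {
        return (DataObjectCache<DO>) caches.get(doClass);
    }
}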
There's a minority of data object classes whose instances I don't want cached. However, I would still like them to be instantiated by the same code that the Function calls, and I still need loading to be handled atomically per ID so that concurrent requests don't each load their own copy.
For these classes, I'm wondering if I can just set the maximum size of the map to 0, so that entries are evicted immediately while I still get the atomic computing behaviour of the map. Is this a good idea, or would it be inefficient?
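Concretely, for those classes I'm picturing the constructor building the map like this instead (assuming MapMaker accepts a maximum size of 0; everything else would stay the same):

// Hypothetical non-caching variant: entries would be evicted essentially
// immediately, while get(id) would still compute loadFunction atomically.
innerCache = new MapMaker()
        .maximumSize(0)
        .makeComputingMap(loadFunction);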
EDIT:
I realized that if entries are evicted immediately after being loaded, there's no way to guarantee they are loaded distinctly - if the map isn't keeping track of them, multiple instances of an object with the same ID could be floating around the environment. So instead of doing this, I think I'll use weak values rather than soft values for the types of objects I don't want taking up cache space - let me know if anyone has an opinion on this.
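In other words, for the classes I don't want taking up cache space, the constructor would build the map like this (same loadFunction as above):

// Weak values: an entry lives only while some caller still holds a reference
// to the loaded object, so concurrent get(id) calls during that time return
// the same instance, and the entry disappears once nothing else references it.
innerCache = new MapMaker()
        .weakValues()
        .makeComputingMap(loadFunction);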