I want to cache a large number of Java objects (String, byte[]) with a composite key (String, int) in a cache like JCS or Infinispan.
The keys can be grouped by their string part (let's call it the ID):
KEY = VALUE
-------------
A 1 = valueA1
A 4 = valueA4
A 5 = valueA5
B 9 = valueB9
C 3 = valueC3
C 7 = valueC7
I need to remove elements grouped by the ID part of the key, so that removing the group A, for example, removes A 1, A 4 and A 5.
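For reference, the composite key is assumed to look something like this (a minimal sketch; the class name Key and the accessors getId() and getInt() are my own names, chosen to match the snippets below):

public final class Key {
    private final String id;
    private final int num;

    public Key(String id, int num) {
        this.id = id;
        this.num = num;
    }

    public String getId() { return id; }
    public int getInt() { return num; }

    // equals/hashCode over both fields so the cache can look entries up
    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof Key)) return false;
        Key k = (Key) o;
        return num == k.num && id.equals(k.id);
    }

    @Override
    public int hashCode() {
        return 31 * id.hashCode() + num;
    }
}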
First I tried something like this:
// collect all keys belonging to the group, then remove them one by one
final List<Key> keys = cache.keySet().stream()
        .filter(k -> k.getId().equals(id))
        .collect(Collectors.toList());
keys.forEach(cache::remove);
While this works, it is, not surprisingly, very expensive and thus slow: every group removal scans the entire key set.
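Assuming the key set is a collection view backed by the cache (true for ConcurrentMap implementations; whether JCS and Infinispan support removal through it is an assumption to verify), the scan can at least be written without the intermediate list:

// still a full O(n) scan over all cached entries
cache.keySet().removeIf(k -> k.getId().equals(id));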
So I tried another approach: using only the ID as the key and grouping the values in a map:
KEY = VALUE
---------------------------------------------
A = {1 = valueA1, 4 = valueA4, 5 = valueA5}
B = {9 = valueB9}
C = {3 = valueC3, 7 = valueC7}
Removing a group is then very efficient:
cache.remove(id);
But putting requires a get:
// read-modify-write: fetch the group, add the entry, store the group back
Map<Integer, Value> map = cache.get(key.getId());
if (map == null) {
    map = new HashMap<>();
}
map.put(key.getInt(), value);
cache.put(key.getId(), map);
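Note that this read-modify-write sequence is not atomic: two threads putting into the same group at the same time can overwrite each other's entries. If the cache implements ConcurrentMap (Infinispan's Cache does; for JCS I am not sure), the update could be done atomically with compute, a sketch:

// atomically add the entry to its group, creating the group if absent
cache.compute(key.getId(), (groupId, group) -> {
    Map<Integer, Value> updated = (group == null) ? new HashMap<>() : new HashMap<>(group);
    updated.put(key.getInt(), value);
    return updated;
});

Copying the group instead of mutating it in place keeps the stored value effectively immutable, which matters if the cache serializes or replicates values.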
Now there are fewer elements in the cache with a simpler key, but the values are larger and more complex. Testing with hundreds of thousands of elements in the cache, deletes are fast, and puts and gets don't seem to be noticeably slower.
Is this a valid solution or are there better approaches?