I'm building a cache and want to unit test it using a real multithreaded approach.
I declare two fields:
private final Object LOCK = new Object();
private final Map<MaterialSpec, Material> materialsCache = new HashMap<>();
Then I have two methods, one to get a Material from the cache and one to clear Materials from the cache:
private Material getAndCacheMaterial(MaterialSpec materialSpec) throws OperationFailedException {
    synchronized (LOCK) {
        // fast path: return the material if it is already cached
        if (materialsCache.containsKey(materialSpec)) {
            return materialsCache.get(materialSpec);
        }
    }
    // the provider call happens outside the lock
    Material material = materialProvider.getMaterial(materialSpec);
    synchronized (LOCK) {
        // cache under the material's own spec
        materialsCache.put(material.getMaterialSpec(), material);
        if (materialSpec.isDefault()) {
            // cache as "default" material in addition to a specific material
            materialsCache.put(materialSpec, material);
        }
    }
    return material;
}
public void clearMaterialFromCache(String materialId) {
    synchronized (LOCK) {
        for (MaterialSpec materialSpec : materialsCache.keySet()) {
            if (materialId.equals(materialSpec.getMaterialId())) {
                materialsCache.remove(materialSpec);
            }
        }
    }
}
As you can see, all accesses to the materialsCache map are synchronized (there are no other places where this field is used). However, despite that, I'm getting a ConcurrentModificationException in my unit test:
java.util.ConcurrentModificationException
at java.base/java.util.HashMap$HashIterator.nextNode(HashMap.java:1493)
at java.base/java.util.HashMap$KeyIterator.next(HashMap.java:1516)
at com.a24z.materials.MaterialsManager.clearMaterialFromCache(MaterialsManager.java:129)
My unit test is indeed multithreaded (I don't think its exact code matters here, but I've included a simplified sketch of it below), so if the implementation weren't thread-safe, this exception would be expected. However, given that I synchronized all accesses to this map, I just can't understand how concurrent modification can happen in this case.
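For context, the test has roughly the following shape. This is a heavily simplified sketch, not the real test code: the MaterialsManager construction, the test data, and the assertions are omitted, and getMaterial, createManager(), randomSpec(), and randomMaterialId() are illustrative stand-ins for the public entry point that ends up in getAndCacheMaterial and for the test data helpers.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Heavily simplified sketch of the test's shape; names are illustrative.
public class MaterialsManagerConcurrencyTest {

    private final MaterialsManager manager = createManager(); // real setup omitted

    public void cacheSurvivesConcurrentAccess() throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(8);
        CountDownLatch start = new CountDownLatch(1);
        List<Future<?>> results = new ArrayList<>();

        for (int i = 0; i < 4; i++) {
            // reader threads: resolve materials via the public method that
            // eventually calls getAndCacheMaterial
            results.add(pool.submit(() -> {
                start.await();
                for (int j = 0; j < 1_000; j++) {
                    manager.getMaterial(randomSpec());
                }
                return null;
            }));
            // eviction threads: clear materials by id while the readers run
            results.add(pool.submit(() -> {
                start.await();
                for (int j = 0; j < 1_000; j++) {
                    manager.clearMaterialFromCache(randomMaterialId());
                }
                return null;
            }));
        }

        start.countDown(); // release all worker threads at the same time
        for (Future<?> result : results) {
            result.get(); // exceptions thrown on worker threads surface here
        }
        pool.shutdown();
    }
}

The stack trace above is what gets propagated out of one of the futures.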
Why do I get this exception?