I maintain a variable number of event listeners in a Map:
private final Map<String, EventListener> eventListeners = new ConcurrentHashMap<>();
The class where the map is used has a method to add an event listener:
public void addEventListener(final String name, final EventListener listener) {
    eventListeners.put(name, listener);
}
Every time an event occurs I iterate over all listeners and fire them:
eventListeners.forEach((name, listener) -> {
    listener.handle(event);
});
The environment is single-threaded. The interesting part is that an event listener implementation may register another event listener while it is being fired.
With an "ordinary" HashMap this would obviously lead to a ConcurrentModificationException, which is why I am using a ConcurrentHashMap.
What I see happening is that a listener inserted by another listener may already be fired for the same event, because it is added during the iteration. As far as I can tell, this is the expected, weakly consistent behaviour of ConcurrentHashMap.
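To illustrate, here is a minimal, self-contained snippet (the class and key names are mine, not from my actual code). As I understand it, the entry added inside forEach may or may not be visited in the same pass, and no exception is thrown:

import java.util.concurrent.ConcurrentHashMap;

public class WeaklyConsistentForEachDemo {
    public static void main(String[] args) {
        ConcurrentHashMap<String, String> map = new ConcurrentHashMap<>();
        map.put("a", "registered up front");

        map.forEach((key, value) -> {
            System.out.println("visiting " + key);
            // Inserting during iteration does not throw, but whether "b" is
            // visited in this same pass is unspecified (weakly consistent).
            map.putIfAbsent("b", "added during iteration");
        });
    }
}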
However, I don't want this to happen, so I would like to defer any insertion of an event listener until iteration over all listeners has completed.
So I introduced a Thread that uses a CountDownLatch to wait for the iteration to complete before it inserts the listener:
public void addEventListener(final String name, final EventListener listener) {
    new Thread(() -> {
        try {
            // Block until the current dispatch loop counts the latch down.
            latch.await();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // restore the interrupt flag
        }
        eventListeners.put(name, listener);
    }).start();
}
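(Not shown above: latch is a field of the same class. Its declaration is roughly the following; the volatile modifier and the initial zero-count latch are details I am spelling out here for completeness, since the helper threads read the field.)

import java.util.concurrent.CountDownLatch;

// Field in the dispatching class; replaced with a fresh latch before each dispatch.
private volatile CountDownLatch latch = new CountDownLatch(0);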
and the iteration over all listeners:
latch = new CountDownLatch(1); // fresh latch for this dispatch
eventListeners.forEach((name, listener) -> {
    listener.handle(event);
});
latch.countDown(); // release the threads waiting to insert listeners
It works as expected, but I wonder whether this is a good solution at all, and whether there are more elegant ways to achieve what I need.
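For comparison, one alternative I can think of is to buffer insertions made while an event is being dispatched and apply them once the loop has finished. A rough sketch (the dispatching flag, the pendingListeners map and the fireEvent/Event names are placeholders I made up, not part of my real code):

import java.util.HashMap;
import java.util.Map;

// EventListener and Event are the same types used in the snippets above.
private final Map<String, EventListener> eventListeners = new HashMap<>();
private final Map<String, EventListener> pendingListeners = new HashMap<>();
private boolean dispatching;

public void addEventListener(final String name, final EventListener listener) {
    if (dispatching) {
        // Defer the insertion; it becomes visible only after this dispatch.
        pendingListeners.put(name, listener);
    } else {
        eventListeners.put(name, listener);
    }
}

public void fireEvent(final Event event) {
    dispatching = true;
    try {
        eventListeners.forEach((name, listener) -> listener.handle(event));
    } finally {
        dispatching = false;
        // Apply the insertions that were deferred during the loop.
        eventListeners.putAll(pendingListeners);
        pendingListeners.clear();
    }
}

Since everything runs on a single thread, a plain HashMap would suffice in that sketch, and listeners added during a dispatch would only see subsequent events, which is what I want.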