I'm writing a class that saves the state of the connected components of a graph, supporting dynamic connectivity. Each time an edge is added or removed, I have to recalculate the neighbouring components to join or split them.
The only exception that those methods can throw is std::bad_alloc. No other exception will be thrown by any of my dependencies, so the only possible failures are due to lack of memory in calls like std::unordered_set<...>::insert or std::deque<...>::push_back.
This complicates the design of my algorithms considerably: I have to build the differences in local data structures first, and then apply all the modifications based on that cached state inside a well-scoped try-catch block.
Readability is severely decreased, and the time needed to think through and write this exception-safe code increases a lot. Also, memory overcommit makes handling these exceptions somewhat pointless.
What do you do in situations like this? Is it really important to ensure exception safety given that, if there is a real lack of memory, your code will probably fail anyway, just later, and the program as a whole will too?
So, in short: is it worth dealing with out-of-memory exceptions at all, considering that, as one comment points out, the exception-throwing mechanism itself could exhaust memory as well?