"Lots of supporting machinery is needed to make writing correct exception-safe code easy."
I'm surprised that more people didn't key into this line. This is the 'con' being discussed: Exception handling is expensive. The rest of the paragraph is just the details of why so much machinery is required.
This is a disadvantage of exceptions that is usually overlooked on dual-core 2 GHz machines with 4 GB of RAM, a 1 TB hard drive, and gobs of virtual memory for every process. If exceptions make the code easier to understand, debug, and write, then buy or build faster hardware, and rewrite the bottlenecks in C, without exceptions.
However, on a system with tighter constraints, you can't ignore the overhead. Try this. Make a test.cpp file like this:
//#define USE_EXCEPTIONS

int main() {
    int value = 0;

#ifdef USE_EXCEPTIONS
    try {
#endif

        value++;

#ifdef USE_EXCEPTIONS
        if (value != 1) {
            throw -1;
        }
    }
    catch (int i) {
        return i;
    }
#else
        if (value != 1) {
            return -1;
        }
#endif

    return value;
}
As you can see, this code does next to nothing. It performs an increment on a local value and returns it.
Compile it anyway with

g++ -S -nostdlib test.cpp

and look at the resulting test.s assembly file. Mine was 29 lines long without the if (value != 1) { return -1; } block, or 37 lines with that return test included. Much of that was labels for the linker.
After you're satisfied with this code, uncomment the #define USE_EXCEPTIONS option at the top and compile again. Wham! 155 lines of assembly to handle the exception. I'll grant you that we now have an extra return statement and an if construct, but those are only a couple of lines each.
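If you'd rather not edit the file between builds, the same comparison can be run from the command line: -D defines the macro for a single compile (a standard g++ flag), and wc -l counts the lines of each assembly listing. The output names below are just my own choices:

g++ -S -nostdlib test.cpp -o test_plain.s
g++ -S -nostdlib -DUSE_EXCEPTIONS test.cpp -o test_eh.s
wc -l test_plain.s test_eh.s

The exact counts will vary with the compiler version and target, but the gap between the two should be hard to miss.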
This is far from a complete exception-handling benchmark. See the ISO/IEC TR 18015 Technical Report on C++ Performance, section 5.4, for a more authoritative and thorough answer. Do note that they start with the almost-as-trivial example:
double f1(int a) { return 1.0 / a; }
double f2(int a) { return 2.0 / a; }
double f3(int a) { return 3.0 / a; }

double g(int x, int y, int z) {
    return f1(x) + f2(y) + f3(z);
}
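You can run the same assembly-size experiment on the report's example. The wrapper below is my own sketch, not code from TR 18015: it adds a handler that can never fire (dividing a double by zero yields infinity rather than throwing), so any extra output is pure exception plumbing:

// My own comparison sketch, not from TR 18015: the same arithmetic
// wrapped in a try/catch that never fires.
double g_with_handler(int x, int y, int z) {
    try {
        return f1(x) + f2(y) + f3(z);
    } catch (...) {
        return 0.0;  // unreachable: double division by zero gives inf, not a throw
    }
}

Compile g and g_with_handler with g++ -S as before and compare what falls out.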
so there is merit in using absurdly small test cases. There are also StackOverflow threads here and here (where I pulled the above link from, courtesy Xavier Nodet).
This is the supporting machinery they were talking about, and it's why 8 GB of RAM will soon be standard, why processors will have more cores and run faster, and why the machine you're on now will be unusable. When coding, you should be able to peel the abstraction away in your head and think about what each line of code really does. Things like exception handling, run-time type identification (RTTI), templates, and the monstrous STL are expensive in terms of memory and (to a lesser degree) runtime. If you've got lots of memory and a blazing CPU, then don't worry about it. If not, be careful.
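To make "peeling the abstraction away" concrete, here is a small sketch of my own (the names are made up, not from anything above): both functions validate the same input, but the error-code version compiles down to a compare and a branch, while the throwing version pulls in the runtime's exception allocation and stack-unwinding machinery even on the path that never throws.

#include <stdexcept>

// Throwing version: the throw statement brings in unwind tables and
// calls into the C++ runtime to allocate and raise the exception object.
int checked_increment_throwing(int value) {
    if (value < 0) throw std::invalid_argument("negative input");
    return value + 1;
}

// C-style version: just a compare and a branch, nothing else emitted.
int checked_increment_code(int value, int *out) {
    if (value < 0) return -1;  // error code
    *out = value + 1;
    return 0;                  // success
}

Run both through g++ -S and the difference in the listings tells the same story as the test.cpp experiment above.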