When an exception is thrown, the runtime builds the stack trace among other things. This is quite costly. If you can, prevent a known error condition like this (here C-style):
public const int INVERSE_DIVISION_BY_ZERO = int.MinValue; // 1 / a can only yield -1, 0 or 1, so this value cannot occur on correct operation
public int Inverse(int a)
{
if (a != 0)
{
return 1 / a;
}
return INVERSE_DIVISION_BY_ZERO;
}
Now, this example is ugly, I know, and it doesn't help much. In fact, all it does is turn testing first:
if (a != 0)
{
b = Inverse(a);
}
into testing later:
b = Inverse(a);
if (b != INVERSE_DIVISION_BY_ZERO)
{
// safe to go
}
Does this make sense? No, and the example is quite poor. But it shows that there is a minimum amount of testing needed to keep code execution safe: if it's not done before the call, it has to be done after, but it can't really be avoided.
Exceptions should be thrown for error conditions that are exceptional in the sense that you can't control them, such as no CD in the drive. When you already know how the code fails, prevent it!
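The .NET base class library applies the same idea with its Try pattern: int.TryParse reports failure through its return value instead of throwing a FormatException, so the caller tests instead of catching. A minimal sketch:

```csharp
using System;

class Program
{
    static void Main()
    {
        int number;

        // Failure is reported by the return value; no exception is thrown
        if (int.TryParse("123", out number))
        {
            Console.WriteLine(number); // prints 123
        }

        // int.Parse("abc") would throw FormatException; TryParse just returns false
        if (!int.TryParse("abc", out number))
        {
            Console.WriteLine("not a number");
        }
    }
}
```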
I want to add this:
Exception handling is a complex design problem, not just an implementation problem. There is a trade-off to be aware of between these two extremes:
- clean, readable code that throws exceptions on occasion;
- efficient code that is difficult to read but prevents exceptions and returns C-style error codes that need interpretation.
The common C# practice is to favor clean, readable code over premature optimization, and I share this vision.
But this does not mean that exceptions should be thrown carelessly. There are cases where intentional throwing makes perfect sense, for example:
enum Gender
{
Unspecified,
Male,
Female
}
// later in the code
switch (gender)
{
case Gender.Unspecified:
// handle
break;
case Gender.Male:
// handle
break;
case Gender.Female:
// handle
break;
default:
throw new ArgumentException(string.Format("Unrecognized gender (was {0})", (int)gender));
}
An unrecognized gender can't be rendered by its name because it has none; the cast renders it as its underlying int value instead.
Now this example is, in my opinion, clean, readable code that is also robust when the Gender enum is modified in the future. At the very least it will tell the developer that they forgot to handle a case...
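A related defensive move, using the BCL's Enum.IsDefined, is to validate the value at the method boundary before any switch runs. A sketch (the method name Handle is made up for illustration):

```csharp
using System;

enum Gender
{
    Unspecified,
    Male,
    Female
}

class Program
{
    static void Handle(Gender gender)
    {
        // Reject values that are not defined members of the enum,
        // e.g. (Gender)42 produced by a careless cast
        if (!Enum.IsDefined(typeof(Gender), gender))
        {
            throw new ArgumentException(
                string.Format("Unrecognized gender (was {0})", (int)gender));
        }
        // safe to switch on gender here
        Console.WriteLine(gender);
    }

    static void Main()
    {
        Handle(Gender.Male); // fine, prints Male
        Handle((Gender)42);  // throws ArgumentException
    }
}
```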
Another example is this:
// A: high performance, let it throw
for (int i = 0; i < values.Length; i++)
{
    values[i] = 1 / values[i];
}
// B: slower performance, test first
for (int i = 0; i < values.Length; i++)
{
    if (values[i] != 0)
    {
        values[i] = 1 / values[i];
    }
}
When values[i] is 0, A will throw a DivideByZeroException while B will silently skip that spot in the array. Now I wonder about this:
- is the array useful with unhandled spots? Maybe it should be discarded at that point;
- is it really faster to test on every iteration than to let an exception be thrown?
- what if 0 occurs so seldom that it happens once every 1000 years (really)? That's practically saying never, but it's based on a probabilistic estimate, not a guarantee;
- maybe testing in each iteration for something that most probably will not happen is a waste...
If there is solid data showing that the error condition is extremely rare, it should be handled with an exception, because testing on every iteration will cost more on average.
This doesn't mean you don't handle the exceptional state; it just means you handle it in a less efficient way, precisely because it's rare and testing beforehand is costly.
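A rough way to check this trade-off empirically is a Stopwatch micro-benchmark. This is only a sketch, not a rigorous benchmark: results depend on the JIT, the hardware, and above all on how often the exception actually fires, and the one-zero-per-thousand density below is chosen arbitrarily:

```csharp
using System;
using System.Diagnostics;

class Program
{
    static void Main()
    {
        int[] values = new int[1000000];
        var random = new Random(42);
        for (int i = 0; i < values.Length; i++)
        {
            // roughly one zero per 1000 elements; vary this to see the trade-off shift
            values[i] = random.Next(1000);
        }

        // A: let it throw, catch the rare failure
        int[] a = (int[])values.Clone();
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < a.Length; i++)
        {
            try { a[i] = 1 / a[i]; }
            catch (DivideByZeroException) { /* rare: leave the spot as-is */ }
        }
        sw.Stop();
        Console.WriteLine("try/catch:  " + sw.ElapsedMilliseconds + " ms");

        // B: test first on every iteration
        int[] b = (int[])values.Clone();
        sw = Stopwatch.StartNew();
        for (int i = 0; i < b.Length; i++)
        {
            if (b[i] != 0)
            {
                b[i] = 1 / b[i];
            }
        }
        sw.Stop();
        Console.WriteLine("test first: " + sw.ElapsedMilliseconds + " ms");
    }
}
```

Both variants leave a zero element unchanged, so they produce identical arrays; only the cost profile differs.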
EDIT: added error condition
EDIT 2: added some more thoughts