
I'm tweaking some code in a RationalNumber implementation. In particular, inside the equality logic, I'm considering the following:

public bool Equals(RationalNumber other)
{
    // Non-finite values never compare equal.
    if (RationalNumber.IsInfinity(this) ||
        RationalNumber.IsInfinity(other) ||
        RationalNumber.IsNaN(this) ||
        RationalNumber.IsNaN(other))
    {
        return false;
    }

    try
    {
        // Fast path: compare the cross products; checked so that an
        // overflow surfaces as an OverflowException.
        checked
        {
            return this.numerator * other.Denominator == this.Denominator * other.numerator;
        }
    }
    catch (OverflowException)
    {
        // Slow path: reduce both fractions via their GCD and compare term by term.
        var thisReduced = RationalNumber.GetReducedForm(this);
        var otherReduced = RationalNumber.GetReducedForm(other);
        return (thisReduced.numerator == otherReduced.numerator) && (thisReduced.Denominator == otherReduced.Denominator);
    }
}

As you can see, I'm using exceptions as a flow control mechanism. The reasoning behind this is that I do not want to incur the penalty of evaluating the greatest common divisor of both fractions on every equality check. Thus I only do it in the least probable case: one or both cross products overflow.
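
For reference, the slow path's reduction is essentially a GCD computation; this is only a simplified, hypothetical sketch of what GetReducedForm might look like (the RationalNumber(long, long) constructor is assumed, and the long.MinValue edge case of Math.Abs is ignored):

public static RationalNumber GetReducedForm(RationalNumber value)
{
    // Divide numerator and denominator by their greatest common divisor.
    long gcd = Gcd(Math.Abs(value.numerator), Math.Abs(value.Denominator));
    return gcd <= 1
        ? value
        : new RationalNumber(value.numerator / gcd, value.Denominator / gcd);
}

// Euclidean algorithm: this loop is the per-comparison work the fast path skips.
private static long Gcd(long a, long b)
{
    while (b != 0)
    {
        long temp = b;
        b = a % b;
        a = temp;
    }
    return a;
}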

Is this an acceptable practice? I've always read that exceptions should never be used as a control-flow mechanism, but I don't really see another way to achieve what I want.

Any alternative approaches are welcome.

– InBetween

3 Answers


The reasoning behind this is that I do not want to incur the penalty of evaluating the greatest common divisor of both fractions on every equality check.

This is sound reasoning. The total cost of this code is

{probability of fast-path} * {fast-path cost}
+ ((1.0 - {probability of fast-path}) * {slow-path cost})

Depending on the three constants involved this will be a good or bad choice. You need to have a good understanding of what data will be processed in practice.
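
For example, with purely illustrative numbers: if the fast path costs 20 ns, the slow path (thrown exception plus GCD reduction) costs 50,000 ns, and 1 comparison in 100,000 overflows, the expected cost per call is 0.99999 * 20 + 0.00001 * 50,000 ≈ 20.5 ns, so the optimization clearly pays off. If instead 1 comparison in 100 overflows, the expected cost is 0.99 * 20 + 0.01 * 50,000 ≈ 520 ns per call, and always reducing up front may already be competitive.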

Note that exceptions are very slow: I once benchmarked them at about 10,000 per second per CPU core, and I'm not sure they would scale to multiple cores due to the internal CLR locks involved.

Maybe you can add runtime profiling: track the rate of exceptions and, if it is too high, switch off the optimization.
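
A minimal sketch of that idea, reusing the members from the question; the counters, the threshold value, and the lack of thread safety are purely illustrative:

private static long comparisons;
private static long overflows;
private const double MaxOverflowRate = 0.01; // arbitrary cut-off for illustration

public bool Equals(RationalNumber other)
{
    // NaN/infinity guard from the question omitted for brevity.
    comparisons++;

    // Only attempt the checked fast path while overflows have stayed rare.
    if (overflows < comparisons * MaxOverflowRate)
    {
        try
        {
            checked
            {
                return this.numerator * other.Denominator == this.Denominator * other.numerator;
            }
        }
        catch (OverflowException)
        {
            overflows++; // fall through to the slow path below
        }
    }

    var thisReduced = RationalNumber.GetReducedForm(this);
    var otherReduced = RationalNumber.GetReducedForm(other);
    return thisReduced.numerator == otherReduced.numerator
        && thisReduced.Denominator == otherReduced.Denominator;
}

In a real multithreaded implementation the counters would need Interlocked updates or sampling, but the idea is the same: once the observed overflow rate crosses the threshold, stop paying for thrown exceptions and always reduce.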

You probably should document why you did this.

It's also not an architectural problem, because if you change your mind later you can easily switch to a different algorithm.

As an alternative, you could first compute and compare unchecked. If the result is "not equal", it is guaranteed that the exact result would be "not equal" too, even if overflow occurred. So that could be an exception-free fast path if many numbers turn out to be not equal.
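
A minimal sketch of that alternative, again reusing the members from the question (NaN/infinity handling omitted); only the "products look equal" case needs the exact, reduced-form comparison:

public bool Equals(RationalNumber other)
{
    long left, right;
    unchecked
    {
        left = this.numerator * other.Denominator;   // may wrap on overflow
        right = this.Denominator * other.numerator;  // may wrap on overflow
    }

    // Exactly equal cross products are also equal modulo 2^64, so differing
    // wrapped values prove the rationals are not equal; no exception needed.
    if (left != right)
    {
        return false;
    }

    // Equal wrapped values might still be a coincidence caused by overflow,
    // so confirm with an exact comparison of the reduced forms.
    var thisReduced = RationalNumber.GetReducedForm(this);
    var otherReduced = RationalNumber.GetReducedForm(other);
    return thisReduced.numerator == otherReduced.numerator
        && thisReduced.Denominator == otherReduced.Denominator;
}

The trade-off is that genuinely equal values now always pay the GCD cost; if equal operands are common, the checked try/catch fast path could still be attempted before falling back to the reduction.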

– usr

Usually catching exceptions has a high overhead, and you should catch exceptions only when you can do something about them.

In your case you can do something about the exception. Using it for control flow is not a problem in my opinion, but I suggest you also implement the alternative logic (check the relevant conditions up front to prevent the exception), then benchmark both options and compare the performance. Catching exceptions usually has a high overhead, but if checking in order to prevent the exception takes more time, then handling the exception is the better way. One possible pre-check is sketched below.
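
To make the "check conditions up front" option concrete, here is one illustrative way to detect whether a long multiplication would overflow without any try/catch, using only divisions against the long.MaxValue and long.MinValue boundaries (the helper name is made up):

// Illustrative helper: returns true if a * b would overflow a long.
private static bool MultiplicationWouldOverflow(long a, long b)
{
    if (a == 0 || b == 0)
    {
        return false;
    }

    if (a > 0)
    {
        return b > 0
            ? a > long.MaxValue / b   // positive * positive above long.MaxValue
            : b < long.MinValue / a;  // positive * negative below long.MinValue
    }

    return b > 0
        ? a < long.MinValue / b       // negative * positive below long.MinValue
        : a < long.MaxValue / b;      // negative * negative above long.MaxValue
}

If the check reports a possible overflow for either cross product, go straight to the reduced-form comparison; otherwise multiply normally. Whether the extra divisions on every comparison beat the exception-based version is exactly what the benchmark should decide.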

Update due to the OP's comment ("It's a new implementation, we are not using the .NET Framework's Rational. The type of Numerator and Denominator is long"):

You can use a bigger type to prevent the overflow exception, like decimal or BigInteger. Note that decimal only raises the limit (the product of two very large long values can still exceed decimal's range), while BigInteger can never overflow:

decimal thisNumerator = this.numerator;
decimal thisDenominator = this.Denominator;
decimal otherNumerator = other.numerator;
decimal otherDenominator = other.Denominator;

// checked is redundant for decimal operands: decimal arithmetic throws
// OverflowException on overflow regardless of the checked context.
checked
{
    return thisNumerator * otherDenominator == thisDenominator * otherNumerator;
}
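
For the BigInteger option, a minimal sketch (it needs using System.Numerics;). BigInteger has arbitrary precision, so the cross products can never overflow, at the cost of extra work per comparison compared with plain long arithmetic:

public bool Equals(RationalNumber other)
{
    // NaN/infinity handling from the question omitted for brevity.
    // BigInteger arithmetic cannot overflow, so no try/catch is needed.
    BigInteger left = (BigInteger)this.numerator * other.Denominator;
    BigInteger right = (BigInteger)this.Denominator * other.numerator;
    return left == right;
}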

Update due to comments: a simple example to show the overhead of exceptions (the snippet assumes using System; and using System.Diagnostics; at the top of the file).

const int Iterations = 100000;
var sw = new Stopwatch();
var sum1 = 0;
sw.Start();
// Exception-based version: int.Parse throws on every iteration because
// the input ("s" + i) is never a valid number, so the catch always runs.
for (int i = 0; i < Iterations; i++)
{
    try
    {
        var s = int.Parse("s" + i);
        sum1 += s;
    }
    catch (Exception)
    {
    }
}
sw.Stop();
Console.WriteLine(sw.ElapsedMilliseconds);
Console.WriteLine(sum1);

var sw2 = new Stopwatch();
var sum2 = 0;
sw2.Start();
// TryParse version: no exception is ever thrown; the try/catch is kept
// only to mirror the structure of the first loop.
for (int i = 0; i < Iterations; i++)
{
    try
    {
        int s;
        if (int.TryParse("s" + i, out s))
            sum2 += s;
    }
    catch (Exception)
    {
    }
}
sw2.Stop();
Console.WriteLine(sw2.ElapsedMilliseconds);
Console.WriteLine(sum2);

The result: handling exceptions is at least 170 times slower.

5123   (elapsed ms, exception-based loop)
0      (sum1)
30     (elapsed ms, TryParse loop)
0      (sum2)

– Hamid Pourjam
  • Thanks for the downvote. Would you please explain the reason so I can improve the answer and make the community a better place? – Hamid Pourjam Jun 26 '15 at 13:05
  • Thanks for your input. My question would be: what other choices do I have to know beforehand whether my "fast" equality check is going to overflow, in order to avoid catching the exception? I don't see any. – InBetween Jun 26 '15 at 13:07
  • your question title is "Can exceptions used as a control flow mechanism be valid in some specific scenarios?" and at the end you said "Any alternative approaches are welcome." – Hamid Pourjam Jun 26 '15 at 13:08
  • Exceptions don't have a "high overhead". http://yoda.arachsys.com/csharp/exceptions.html – Tim Schmelter Jun 26 '15 at 13:09
  • 118ms for a single non I/O line of code is not high? @TimSchmelter – Hamid Pourjam Jun 26 '15 at 13:11
  • @dotctor: it is _throwing_ 118 exceptions _per millisecond_ – Tim Schmelter Jun 26 '15 at 13:12
  • @dotctor I can't make sense of your last comment. I quote your answer: *"(...) I suggest you to implement the logic (check different conditions to prevent exceptions) then benchmark both options and compare the performance"*. So it begs the question: what logic should I implement to prevent the exceptions? – InBetween Jun 26 '15 at 13:12
  • @InBetween, I don't see anything wrong with your current approach. I think what the answer poster means is that, in the general case, if catching exceptions can be avoided then we should do so. – Rahul Jun 26 '15 at 13:15
  • @TimSchmelter My question is not so much about *performance*; it's more about coding *style and quality*. I am willing to take the performance penalty of using exceptions because I know my implementation will be considerably faster in the general case, and not that much slower in the corner cases where the exception is actually thrown. – InBetween Jun 26 '15 at 13:15
  • @dotctor: Your code does not tell anything about the overhead of exceptions in a real-world application or in this specific case. Of course `int.Parse` is more efficient than using exceptions, otherwise MS would also have used exception handling in `int.Parse`. But suppose your test needs 10ms to check whether an argument is valid while handling the possible exception takes 1ms: what is more efficient? – Tim Schmelter Jun 26 '15 at 13:33
  • That's what I'm trying to say: if checking to prevent exceptions takes more time than handling them, then handle them! And I don't understand what you said about the code; the part with `TryParse` does a lot more than the part which throws the exception, and it is still faster. – Hamid Pourjam Jun 26 '15 at 13:40
  • @dotctor: the `TryParse` part does not do more. My point is that of course `int.TryParse` is more efficient than `int.Parse`+exception, otherwise Microsoft would have implemented `int.TryParse` using exception handling [which is not the case](http://stackoverflow.com/a/15294922/284240). But you can't deduce anything from that fact. I could also show you infinite examples where the test whether an argument is valid (like `int.TryParse`) takes more time than just trying it and handling the possible exception. All the more so if that exception is very unlikely. – Tim Schmelter Jun 26 '15 at 13:46
  • I think both of us are saying the same thing in different ways. Yes, of course it depends on the probability of the condition which causes the exception, and usually handling the exception is faster than the check and takes less effort to implement. @TimSchmelter – Hamid Pourjam Jun 26 '15 at 13:49

This approach is described on MSDN: https://msdn.microsoft.com/en-Us/library/74b4xzyw.aspx

But catching an exception has a high overhead, perhaps because the process may switch from user mode to kernel mode while the exception is being dispatched.