I'm trying to divide two decimal values, which produces a fraction, then multiply by an integer (whole number) to produce an accurate integer result.

Here are a couple of examples of the current outputs I'm experiencing. b and c are decimal values because they can contain fractional precision, but that is not relevant to this example:

Code:

int a = 720;
decimal b = 8;
decimal c = 12;
var value = (a * (b / c));
return (int)value; 

Example:

Case 1:

720 * (8 / 12) = (int)480

Case 2:

480 * (8 / 24) = 159 <-- Expecting 160

I have tried applying rounding to fix case 2, but this creates a problem in case 1, where the value becomes 721. I gather this is happening because dividing the decimals inside the parentheses produces values like 0.666...7 or 0.333...3, which do not always translate cleanly into an integer. Often I'm one whole number above or below what I'm expecting.
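
For illustration, here is a minimal sketch (my own snippet, not part of the code above) that prints the intermediate values and shows where the truncation happens:

decimal q1 = 8m / 12m;  // 0.666...7 -- decimal rounds the last digit up
decimal q2 = 8m / 24m;  // 0.333...3 -- repeating threes, slightly under one third
Console.WriteLine(720 * q1); // slightly above 480, so (int) still yields 480
Console.WriteLine(480 * q2); // slightly below 160, so (int) truncates to 159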

How can I get accurate results in this case?

1 Answer

Casting a decimal to an int simply truncates the fractional part; it does not apply rounding.

You need to use Math.Round(decimal) before casting:

return (int)Math.Round(value);

Or decimal.Round(decimal):

return (int)decimal.Round(value);
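
Applied to the failing case from the question, a minimal sketch (the comments show the expected values):

decimal value = 480 * (8m / 24m);    // 159.999... because 8 / 24 is a repeating third
int result = (int)Math.Round(value); // rounds up to 160 before the cast
// Case 1 is unaffected: (int)Math.Round(720 * (8m / 12m)) is still 480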

Note that values ending in .5 don't always round up (unless you specify MidpointRounding.AwayFromZero). From the Math.Round(decimal) documentation:

The integer nearest the d parameter. If the fractional component of d is halfway between two integers, one of which is even and the other odd, the even number is returned. Note that this method returns a Decimal instead of an integral type.
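
For example, a small sketch of the midpoint behaviour:

Console.WriteLine(Math.Round(0.5m)); // 0 -- midpoint rounds to the nearest even number
Console.WriteLine(Math.Round(1.5m)); // 2 -- also rounds to even
Console.WriteLine(Math.Round(0.5m, MidpointRounding.AwayFromZero)); // 1
Console.WriteLine(Math.Round(1.5m, MidpointRounding.AwayFromZero)); // 2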
