I'm trying to divide two decimal values to produce a fraction, then multiply by an integer (whole number) to produce an accurate integer result.
Here are a couple of examples of the output I'm currently seeing. b and c are decimal values because they can carry fractional precision, but that is not relevant to this example:
code:
int a = 720;
decimal b = 8;
decimal c = 12;
var value = (a * (b / c));   // b / c is evaluated first, producing a rounded repeating decimal
return (int)value;           // the cast truncates toward zero; it does not round
Examples:
Case 1:
720 * (8 / 12) = (int)480 <-- correct
Case 2:
480 * (8 / 24) = (int)159 <-- expecting 160
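For reference, here is a minimal, self-contained console sketch that reproduces both cases and prints the intermediate decimal before the cast (the PrintCase helper is just my own naming for this illustration):

code:
using System;

class Program
{
    static void Main()
    {
        // Case 1: 8 / 12 rounds slightly above 2/3, so the product lands
        // just above 480 and the cast truncates back down to 480.
        PrintCase(720, 8m, 12m);

        // Case 2: 8 / 24 rounds slightly below 1/3, so the product lands
        // just below 160 and the cast truncates down to 159.
        PrintCase(480, 8m, 24m);
    }

    static void PrintCase(int a, decimal b, decimal c)
    {
        decimal fraction = b / c;      // repeating decimal, rounded at the last digit
        decimal value = a * fraction;  // the rounding error is scaled up by a
        Console.WriteLine($"{a} * ({b} / {c}) -> fraction = {fraction}, value = {value}, (int) = {(int)value}");
    }
}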
I have tried to apply rounding to fix case 2, but that breaks case 1, where the value becomes 721. As far as I can gather, this happens because dividing the decimals inside the parentheses produces values like 0.666...7 or 0.333...3, which do not always translate cleanly back into an integer. Often the result is one whole number above or below what I'm expecting.
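To make the truncation concrete (the hard-coded values below match the intermediates printed by the sketch above, as I calculated them): a decimal-to-int cast simply discards the fractional part rather than rounding, so a value a hair below 160 drops all the way to 159:

code:
decimal justAbove = 480.00000000000000000000000002m;  // case 1 intermediate value
decimal justBelow = 159.99999999999999999999999998m;  // case 2 intermediate value
Console.WriteLine((int)justAbove);   // 480 - truncation happens to be harmless here
Console.WriteLine((int)justBelow);   // 159 - the fraction is discarded, not rounded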
How can I get accurate results in this case?