I came up with a formula to keep the numbers as simple ints, and I was surprised to see that the following produces 0:
var x = 45;
var y = 100;
var z = Convert.ToDecimal(x / y * 100);
In the above example it almost seems ridiculous that I'm dividing by 100 and then multiplying by 100, but in most cases the x and y values are not nice integers like these. Whenever my x and y values do end up being integers, the conversion above produces 0. Why is that?
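
For reference, here is a minimal, self-contained repro of what I'm seeing (assuming a plain console app; the variable names match the snippet above):

using System;

class Repro
{
    static void Main()
    {
        var x = 45;
        var y = 100;

        // Same expression as in the snippet above: this prints 0,
        // not the 45 I was expecting.
        var z = Convert.ToDecimal(x / y * 100);
        Console.WriteLine(z);
    }
}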