Can someone explain why the code below outputs 1100 in both cases?
decimal toEven = 1100.5m;
decimal awayFromZero = 1099.5m;
Console.WriteLine(Math.Round(toEven)); // output 1100
Console.WriteLine(Math.Round(awayFromZero)); // output 1100
It looks like Math.Round() changes its MidpointRounding strategy at 1100. If you call Math.Round() on decimals under 1100 ending in .5, it appears to use AwayFromZero by default, but on decimals over 1100 it appears to use ToEven by default. Why?
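To make the comparison concrete, I also tried passing each strategy explicitly through the Math.Round(decimal, MidpointRounding) overload; the commented outputs are what I would expect each rule to produce:

decimal toEven = 1100.5m;
decimal awayFromZero = 1099.5m;

// ToEven sends a .5 midpoint to the nearest even integer:
Console.WriteLine(Math.Round(toEven, MidpointRounding.ToEven));             // 1100 (even)
Console.WriteLine(Math.Round(awayFromZero, MidpointRounding.ToEven));       // 1100 (even)

// AwayFromZero sends a positive .5 midpoint upward:
Console.WriteLine(Math.Round(toEven, MidpointRounding.AwayFromZero));       // 1101
Console.WriteLine(Math.Round(awayFromZero, MidpointRounding.AwayFromZero)); // 1100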
I know I can set the MidpointRounding myself to fix the problem; I'm just curious why Math.Round() works like this.
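For reference, the workaround I mean is simply forcing AwayFromZero on the original values:

Console.WriteLine(Math.Round(1100.5m, MidpointRounding.AwayFromZero)); // 1101
Console.WriteLine(Math.Round(1099.5m, MidpointRounding.AwayFromZero)); // 1100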