I am getting different results for the same subtraction depending on whether I use double or decimal:
double value1 = 280.585 - 280.50;
decimal value2 = Convert.ToDecimal(280.585) - Convert.ToDecimal(280.50);
Console.WriteLine("Double:" + value1);
Console.WriteLine("Decimal:" + value2);
Output:
Double:0.0849999999999795
Decimal:0.085
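(For reference, here is a small check I wrote myself, not part of the original snippet: printing the doubles with the "G17" format string shows enough digits to expose the binary approximation that a plain WriteLine hides.)

```csharp
using System;

class Program
{
    static void Main()
    {
        // 280.585 has no exact binary representation, so the double
        // stores the nearest representable value; "G17" prints enough
        // significant digits to make that approximation visible.
        Console.WriteLine(280.585.ToString("G17"));
        Console.WriteLine(280.50.ToString("G17"));
        Console.WriteLine((280.585 - 280.50).ToString("G17"));
    }
}
```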
But why do int and long give the same result?
int value1 = 2+2;
long value2 = 2+2;
Console.WriteLine(value1);
Console.WriteLine(value2);
Output:
4
4
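(Again my own sketch, not from the original code: small whole numbers are stored exactly in binary, so integer arithmetic cannot diverge the way the fractional double case does.)

```csharp
using System;

class Program
{
    static void Main()
    {
        // int and long both store whole numbers exactly in binary,
        // so 2 + 2 is exactly 4 in either type.
        int a = 2 + 2;
        long b = 2 + 2;
        Console.WriteLine(a == b);           // True

        // Even double agrees here: 2.0 and 4.0 are exactly
        // representable in binary floating point.
        Console.WriteLine(2.0 + 2.0 == 4.0); // True
    }
}
```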