I've noticed some odd behaviour when adding two double values, one negative and one positive:
var eighteenPointFive = -18.5;
var nineteenPointFour = 19.4;
var eighteenPointFour = -18.4;
var nineteenPointFive = 19.5;
Console.WriteLine(eighteenPointFive + nineteenPointFour); // prints 0.899999999999999 (why not 0.9?)
Console.WriteLine(eighteenPointFour + nineteenPointFive); // prints 1.1
Why is there a difference in the number of digits after the decimal point between the two cases?
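For reference, printing the round-trippable "G17" format shows the values the doubles actually hold (a minimal sketch reusing the variables above; it assumes a .NET runtime where double.ToString("G17") gives the round-trip representation):
Console.WriteLine(nineteenPointFour.ToString("G17"));   // how 19.4 is really stored (not exactly representable in binary)
Console.WriteLine(nineteenPointFive.ToString("G17"));   // 19.5 is exactly representable in binary
Console.WriteLine((eighteenPointFive + nineteenPointFour).ToString("G17")); // full precision of the first sum
Console.WriteLine((eighteenPointFour + nineteenPointFive).ToString("G17")); // full precision of the second sum
Depending on the runtime, the plain Console.WriteLine(double) output may round to about 15 significant digits, which is why the second sum can appear as exactly 1.1 even if its stored value carries a small error.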