?(1.0-0.9-0.1)
-0.000000000000000027755575615628914
?((double)1.0-(double)0.9-(double)0.1)
-0.000000000000000027755575615628914
?((double)1.0-(double)0.9-(double)0.1).GetType()
{Name = "Double" FullName = "System.Double"}
?((double)1.0-(double)0.9-(double)0.1).ToString()
"-2,77555756156289E-17"
How can Double.ToString() (and the debugger display) show more significant digits (up to 17) than a double's precision (15-16)?
I expect MyObject.ToString() to represent just MyObject, not MyObject + SomeTrashFromTheComputer.
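A sketch of where the extra digits come from, in Python for convenience (CPython's float is the same IEEE 754 binary64 as System.Double, so the arithmetic is bit-identical): 15-16 digits is the precision you can *trust*, but 17 significant digits can be *needed* to uniquely identify the stored bits. The 17-digit string is not trash added to the value; it is the shortest decimal that round-trips back to exactly the same double.

```python
# Python floats are IEEE 754 binary64, the same format as .NET System.Double.
x = 1.0 - 0.9 - 0.1

# Shortest round-trip representation: 17 significant digits here.
print(repr(x))                    # -2.7755575615628914e-17

# Truncated to 15 significant digits (the "reliable" decimal precision):
print(f"{x:.15g}")                # -2.77555756156289e-17

# Only the 17-digit form recovers the exact stored bits when parsed back.
print(float(repr(x)) == x)        # True
print(float(f"{x:.15g}") == x)    # False
```

This mirrors .NET: the classic Double.ToString() (no format) uses 15 digits, while the debugger and round-trip formats emit enough digits (up to 17) to reconstruct the exact bits.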
Why do these come out exact:
?0.1
0.1
?0.2-0.1
0.1
?0.1-0.1
0.0
BUT
?1.0-0.9-0.1
-0.000000000000000027755575615628914
And WHY does the order of the operands matter:
?1.0-0.1-0.9
0.0
BUT
?1.0-0.9-0.1
-0.000000000000000027755575615628914
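The order dependence can be reproduced and explained step by step; again a Python sketch, since its floats are the same binary64 doubles. Neither 0.1 nor 0.9 is exactly representable in binary, and whether the individual rounding errors cancel depends on the order in which the subtractions round:

```python
# The double nearest to 0.1 is slightly ABOVE 0.1:
print(f"{0.1:.20f}")              # 0.10000000000000000555...

# Order 1: 1.0 - 0.1 rounds to exactly the double 0.9, so the
# two representation errors cancel and the final result is 0.0.
assert 1.0 - 0.1 == 0.9
assert 1.0 - 0.1 - 0.9 == 0.0

# Order 2: 1.0 - 0.9 is computed exactly (no rounding: the operands
# are close enough, per the Sterbenz lemma), giving
# 0.09999999999999997779..., which is NOT the double 0.1.
assert 1.0 - 0.9 != 0.1

# Subtracting the double 0.1 (which is slightly too big) then exposes
# the accumulated error: exactly -2^-55.
print(1.0 - 0.9 - 0.1)            # -2.7755575615628914e-17
assert 1.0 - 0.9 - 0.1 == -2**-55
```

So nothing is broken: each individual operation is correctly rounded, but `1.0 - 0.1` happens to round back onto the double `0.9` (errors cancel), while `1.0 - 0.9` is exact and leaves the representation error of `0.1` uncancelled.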