I was playing around with Python when I tried
>>> Decimal(0.1)
Decimal('0.1000000000000000055511151231257827021181583404541015625')
which I thought was normal, given floating-point inaccuracy. Based on that, I also expected 0.1 * 10 to be slightly greater than 1.0.
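To see what is actually stored, I also printed 0.1 as an exact fraction (if I understand fractions.Fraction correctly, passing it a float gives that float's exact value):
>>> from fractions import Fraction
>>> Fraction(0.1)
Fraction(3602879701896397, 36028797018963968)
The denominator is 2**55, and the numerator times 10 is slightly bigger than the denominator, so the stored value really is a little more than 1/10.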
I then tried
>>> Decimal(0.1 * 10)
Decimal('1')
>>> 0.1 * 10 == 1.0
True
which seems weird, because going by the representation above they shouldn't be equal.
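As a sanity check, I redid the multiplication with exact fractions (assuming Fraction arithmetic doesn't round):
>>> from fractions import Fraction
>>> Fraction(0.1) * 10
Fraction(18014398509481985, 18014398509481984)
That is slightly greater than 1, so it looks like the float multiplication 0.1 * 10 rounds the exact result back to exactly 1.0.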
>>> total = 0.0
>>> for i in range(10):
...     total += 0.1
...
>>> Decimal(total)
Decimal('0.99999999999999988897769753748434595763683319091796875')
which is also weird, because I expected it to come out slightly greater than 1.0.
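I also tried math.fsum, which as far as I can tell from the docs keeps track of the intermediate errors and rounds only once at the end:
>>> import math
>>> math.fsum([0.1] * 10)
1.0
So the loop apparently loses a tiny amount at each +=, while a single rounding of the exact sum lands back on 1.0.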
Can someone explain this to me?
I am not sure if this is relevant, but I used Python 3.5.2 and Python 2.7.12 on Windows 8.1 64-bit.