Discrepancies in Python floats

Item1 = 0.0015700639764578807
Item2 = 0.9984299360235422
total = Item1 + Item2
When adding Item1 and Item2, the total should be 1, but total comes out as the slightly incorrect value 1.0000000000000002.
Why is this, and what are some ways around it? I could always use round(), but I need to know whether total is actually > 1: if total were really something like 1.05, round() would just round it down to 1, so that's not ideal.
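One workaround I've been experimenting with is the decimal module: if the Decimal values are built from strings, the arithmetic happens in exact base 10, so a > 1 comparison is reliable. For plain floats, math.isclose can compare within a tolerance instead. A sketch (note that the two decimal literals above do not actually sum to exactly 1 even in decimal arithmetic):

```python
from decimal import Decimal
import math

# Build Decimals from strings so no binary rounding happens at all.
item1 = Decimal("0.0015700639764578807")
item2 = Decimal("0.9984299360235422")
total = item1 + item2

print(total)      # 1.0000000000000000807 -- the literals don't sum to 1 exactly
print(total > 1)  # True, and this comparison is exact

# Alternatively, with plain floats, compare within a tolerance:
float_total = 0.0015700639764578807 + 0.9984299360235422
print(math.isclose(float_total, 1.0))  # True with the default rel_tol=1e-9
```

So depending on whether "actually > 1" means exceeding 1 by any amount (use Decimal) or exceeding it by more than rounding noise (use a tolerance), one of these might fit.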
Another example which "works": Item1 = 0.0012024048096192384 and Item2 = 0.9987975951903808. These two add up to exactly 1 in Python, but the example above doesn't, which is very strange.
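From what I understand, both pairs get rounded to the nearest binary double when the literals are parsed, and whether the two per-number rounding errors happen to cancel in the sum is a coincidence of the particular bit patterns, not a rule. A quick sketch showing both pairs, and that the stored value of a float is generally not the decimal literal you typed:

```python
# Both pairs are rounded to the nearest binary double on parsing.
pair_a = (0.0015700639764578807, 0.9984299360235422)   # sums to 1.0000000000000002
pair_b = (0.0012024048096192384, 0.9987975951903808)   # sums to exactly 1.0

for a, b in (pair_a, pair_b):
    print(f"{a!r} + {b!r} = {(a + b)!r}")

# The stored binary value differs from the decimal literal, e.g. for 0.1:
print(f"{0.1:.25f}")  # prints the true stored value, not 0.1000...0
```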
Any ideas?
I was expecting total to be 1, not 1.0000000000000002.