Description of the issue:
Here is an issue (I hope it is reproducible) that I encountered with a simple Python 3.6.7 function, on Ubuntu 18.04 (4.15.0-43-generic x86_64 GNU/Linux):
def myfunc(x):
    while x < 0.2:
        y = x
        print("x: %.2f, y: %.2f: " % (x, y))
        print("Test: {}".format(x >= 0 and x <= 1 and y >= 0 and y <= 1))
        x += 0.1
Then, when starting from -0.4:
myfunc(-0.4)
it prints:
x: -0.40, y: -0.40:
Test: False
x: -0.30, y: -0.30:
Test: False
x: -0.20, y: -0.20:
Test: False
x: -0.10, y: -0.10:
Test: False
x: -0.00, y: -0.00:
Test: False
x: 0.10, y: 0.10:
Test: True
x: 0.20, y: 0.20:
Test: True
And when starting with -0.2:
myfunc(-0.2)
it prints:
x: -0.20, y: -0.20:
Test: False
x: -0.10, y: -0.10:
Test: False
x: 0.00, y: 0.00:
Test: True
x: 0.10, y: 0.10:
Test: True
The behavior at 0.00 is not the same: the test is False when starting from x = -0.4 (and the output even shows -0.00, with a minus sign), but True when starting from x = -0.2.
This is obscure to me. Do you know what phenomenon is happening here?
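In case it is useful, here is a minimal diagnostic variant of the loop (the name myfunc_repr is just mine for this sketch) that prints repr(x) instead of the rounded %.2f value, so it should show exactly what the comparisons see at each step:

def myfunc_repr(x):
    # Same loop as above, but print the full value that the
    # comparisons actually operate on, not the %.2f rounding.
    while x < 0.2:
        print("x =", repr(x), "-> Test:", x >= 0 and x <= 1)
        x += 0.1

myfunc_repr(-0.4)
myfunc_repr(-0.2)

If the usual binary floating-point representation error is accumulating in x += 0.1, I would expect the -0.4 run to reach a tiny nonzero value near zero (on the order of -2.8e-17) rather than an exact 0.0, which %.2f would then render as the -0.00 seen above, while the -0.2 run might land on 0.0 exactly.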