In a long script of about 600 lines, one part of the code produces a strange result:
idl = 0
print type(dl), dl   # dl is computed earlier in the code, not set to a literal here
idl = int(dl*10)+1
print idl
This prints:
<type 'float'> 0.1
1
This calculation happens inside a function definition in my code, and 1 is obviously not the result I expect. The strange thing is that when I copy the code above into a separate Python file:
idl = 0
dl = 0.1
print type(dl), dl
idl = int(dl*10)+1
print idl
I get:
<type 'float'> 0.1
2
What could be the origin of this problem? I've extracted only these parts to keep the question simple, but I can give more information if needed.
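For reference, here is a minimal sketch that reproduces the same symptom. It rests on an assumption about my real code, namely that dl there is the result of a computation rather than the literal 0.1, so the stored float can be slightly below 0.1 even though print shows 0.1 (Python 2 syntax, as above):

dl = 0.3 - 0.2              # hypothetical computed value, not the literal 0.1
print type(dl), dl          # <type 'float'> 0.1
print repr(dl)              # roughly 0.0999999999999999..., slightly below 0.1
print int(dl*10) + 1        # 1, because dl*10 is just under 1.0 and int() truncates

If that assumption does not hold for my code, the sketch may not apply, but it shows one way the same output can appear.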