I am a university student learning Python, and I recently noticed the following differences.
">>> 0.57*100"
"56.99999999999999"
">>> 0.58*100"
"57.99999999999999"
">>> 0.56*100"
"56.00000000000001"
">>> 0.55*100"
"55.00000000000001"
The code above was run in the interactive interpreter of CPython 3.7.7.
I also multiplied other two-decimal numbers such as 0.54, 0.19, and 0.99 by 100 (the small loop below reproduces the check). Of the values I tried, only 0.57 and 0.58 produced results like 56.99999999999999 and 57.99999999999999, and only 0.55 and 0.56 produced results like 55.00000000000001 and 56.00000000000001. Most other two-decimal numbers, such as 0.19 and 0.99, gave the exact values 19.0 and 99.0.
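For reference, this is roughly the loop I used to check (a quick sketch of my own, nothing standard): it multiplies every two-decimal value from 0.00 to 0.99 by 100 and prints only the cases where the result is not the exact integer.

for i in range(100):
    x = i / 100            # the two-decimal value, e.g. 0.57
    result = x * 100
    if result != i:        # compare against the exact integer I expect
        print(f"{x!r} * 100 = {result!r}")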
Do these differences occur only on my computer, or on everyone's? And if every Python environment shows them, why do they happen?