Came across this and I'm trying to understand why it happens.
Initially I thought it was down to type conversions between strings, ints, floats etc., but it appears not.
In JS, 77.6 * 100 gives 7759.999999999999.
But 77.6 * 1000 is not 77599.999999999999 — it's exactly 77600.
(77.6*100) == (77.6*1000)/10 will give you false, but
(66.7*100) == (66.7*1000)/10 will give you true
This only seems to happen with certain values like 77.6 and 100; division shows the same behaviour too.
I've tried this on Windows and Linux, in the Chrome JS console, Node.js, and also PHP. All exhibit the same behaviour.
It's not exactly an insurmountable problem, but I am curious.
Any ideas?
Cheers