When I try `0.14 * 100` in the browser console (the result is the same in all browsers), I get `14.000000000000002` as a result. If I instead try `0.14 * 1000`, it correctly returns `140`.
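The behavior is easy to reproduce. Here is a small sketch (assuming any IEEE 754 double environment, e.g. Node.js or a browser console) that also prints extra digits of `0.14` to show it is not stored exactly:

```javascript
// 0.14 cannot be represented exactly in binary; the stored double
// is very slightly above 0.14. Printing more digits reveals this:
console.log((0.14).toPrecision(20));

// Multiplying by 100 rounds to a double just above 14:
console.log(0.14 * 100);          // 14.000000000000002

// Multiplying by 1000 happens to round back to exactly 140,
// because the accumulated error is less than half a unit in the
// last place at that magnitude:
console.log(0.14 * 1000);         // 140
console.log(0.14 * 1000 === 140); // true
```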
If it returned approximate results in both cases, I'd just assume that 0.14 cannot be exactly represented in binary, but that doesn't seem to be the case here.
EDIT: no, this is not a duplicate of Is floating point math broken? That question uses `0.1 + 0.2`, which evaluates to an imprecise number. Meanwhile, `0.14` appears completely precise; the problem only occurs when multiplying it by 10 or 100, but it works with 1000.
EDIT: sorry, it's not precise either, but the point stands: why does it return a whole number when multiplied by 1000?