Put simply, my question is: if I evaluate `0.1 + 0.2 !== 0.3` in JavaScript, it returns `true`. But `0.1 + 0.3 !== 0.4` returns `false`.

Why?
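A minimal snippet to reproduce what I'm seeing (run in Node or a browser console; the comments show the values that get printed):

```javascript
// What the engine actually computes for each sum:
console.log(0.1 + 0.2);          // 0.30000000000000004
console.log(0.1 + 0.3);          // 0.4

// The comparisons from the question:
console.log(0.1 + 0.2 !== 0.3);  // true
console.log(0.1 + 0.3 !== 0.4);  // false
```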
When I searched on Google, I found that JavaScript engines use the IEEE 754 format for floating-point numbers, and that there is no separate integer type.
Why does it behave differently in the two examples above?