In JavaScript, to check whether a number is an integer, we can use the Number.isInteger() method:
Number.isInteger(10);    // returns true
Number.isInteger(20);    // returns true
Number.isInteger(20.10); // returns false
However, if I pass 1.0 in, it doesn't return what I am expecting:
Number.isInteger(1.0); // returns true
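For context on why this happens: JavaScript has only one number type, an IEEE 754 double, so the literals 1 and 1.0 parse to the exact same value, and the trailing .0 is gone before Number.isInteger() ever sees it. A quick check illustrating this:

console.log(1 === 1.0);        // true — both literals produce the same double value
console.log((1.0).toString()); // "1" — the ".0" is not preserved in the value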
I want to treat 1.0 as a floating-point number, not an integer. Is there any way in JavaScript to differentiate between 1 as an int and 1.0 as a float?