I have read that there is no integer type in JavaScript, only a double-precision (64-bit) floating point type. I understand that, because of this, some floating point arithmetic errors can be introduced even when comparing integers.
However, I don't understand why the three comparisons below would yield different results:
var aInt = 1 + 2;
var bInt = 3;
console.log("Comparing integers: ");
console.log(aInt === bInt); // Prints true

var aNum = 0.1 + 0.2;
var bNum = 0.3;
console.log("Comparing 0.1 + 0.2 === 0.3: ");
console.log(aNum === bNum); // Prints false

var aNum2 = 1.0 + 2.0;
var bNum2 = 3.0;
console.log("Comparing 1.0 + 2.0 === 3.0: ");
console.log(aNum2 === bNum2); // Prints true
I would have expected all of the above comparisons to evaluate to false.
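For what it's worth, here is a minimal sketch (assuming Number.prototype.toPrecision behaves as specified, and that the console prints the returned string as-is) that shows each value with more significant digits than the default toString output, so any hidden difference the === comparison sees should become visible:

// Print each value with 21 significant digits to expose the underlying
// double-precision representation (the default toString rounds to the
// shortest string that round-trips).
console.log((1 + 2).toPrecision(21));     // "3.00000000000000000000"
console.log((0.1 + 0.2).toPrecision(21)); // roughly "0.300000000000000044409"
console.log((0.3).toPrecision(21));       // roughly "0.299999999999999988898"
console.log((1.0 + 2.0).toPrecision(21)); // "3.00000000000000000000"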