
I have read that there is no integer type in JavaScript, only a double-precision (64-bit) floating-point type. I understand that, because of this, some floating-point arithmetic errors can be introduced even when comparing integers.

However, I don't understand why the three comparisons below would yield different results:

var aInt = 1 + 2;
var bInt = 3;
console.log("Comparing integers: ");
console.log(aInt === bInt); // Prints true

var aNum = 0.1 + 0.2;
var bNum = 0.3;
console.log("Comparing 0.1 + 0.2 === 0.3: ");
console.log(aNum === bNum); // Prints false

var aNum2 = 1.0 + 2.0;
var bNum2 = 3.0;
console.log("Comparing 1.0 + 2.0 === 3.0: ");
console.log(aNum2 === bNum2); // Prints true

I would have expected all the above comparisons to evaluate to false.

balajeerc

1 Answer


In the second example, aNum evaluates to 0.30000000000000004, while bNum is 0.3, so the strict comparison is false.
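
You can see this directly by logging the raw values (the results below are what standard IEEE 754 double-precision arithmetic produces):

console.log(0.1 + 0.2);         // Prints 0.30000000000000004
console.log(0.1 + 0.2 === 0.3); // Prints false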

To make the strict equality comparison return true, try rounding with .toFixed() and converting the result back to a number with Number():

var aNum = 0.1 + 0.2;
var bNum = 0.3;
console.log(Number(aNum.toFixed(1)), bNum);    // Prints 0.3 0.3
console.log(Number(aNum.toFixed(1)) === bNum); // Prints true
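
As an alternative, a common general-purpose pattern (not specific to this answer) is to compare against a small tolerance such as Number.EPSILON instead of rounding:

var aNum = 0.1 + 0.2;
var bNum = 0.3;
// Treat the two values as equal if they differ by less than Number.EPSILON
console.log(Math.abs(aNum - bNum) < Number.EPSILON); // Prints true
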
guest271314