
Put simply, my question is:

If I evaluate 0.1+0.2 !== 0.3 in JavaScript, it returns true. But 0.1+0.3 !== 0.4 returns false. Why?
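
For reference, this is what the two cases look like in the browser console (the values in the comments are the output I get; as far as I know current engines all print the same thing):

    console.log(0.1 + 0.2);           // 0.30000000000000004
    console.log(0.1 + 0.2 !== 0.3);   // true  -> the sum is not exactly the stored 0.3
    console.log(0.1 + 0.3);           // 0.4
    console.log(0.1 + 0.3 !== 0.4);   // false -> the sum is exactly the stored 0.4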

When I searched on Google, I found that JavaScript engines use the IEEE 754 format for floating-point numbers, and that the language has no separate integer type.
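
To see more of what is actually stored, here is a quick sketch that prints extra digits with toPrecision (the choice of 20 significant digits is arbitrary, just more than the default formatting shows):

    // Each decimal literal is rounded to the nearest IEEE 754 double when parsed:
    console.log((0.1).toPrecision(20));        // 0.10000000000000000555
    console.log((0.2).toPrecision(20));        // 0.20000000000000001110
    console.log((0.3).toPrecision(20));        // 0.29999999999999998890
    console.log((0.4).toPrecision(20));        // 0.40000000000000002220

    // The sums are rounded again to the nearest double:
    console.log((0.1 + 0.2).toPrecision(20));  // 0.30000000000000004441 -> differs from the stored 0.3
    console.log((0.1 + 0.3).toPrecision(20));  // 0.40000000000000002220 -> identical to the stored 0.4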

Why does it behave differently in the two examples above?

  • Not sure if it is the same in JavaScript, but in many languages decimal values are not reliable for comparison. The reason is that the binary representation of a decimal, which many languages use, implies some loss of precision. – Johnride Jan 28 '14 at 13:46
  • @Johnride's comment is correct. I found this article a good read (although it's from a design magazine): http://coding.smashingmagazine.com/2011/05/30/10-oddities-and-secrets-about-javascript/ (it's explained under Miscellaneous). – joewhite86 Jan 28 '14 at 13:48
  • What exactly are the -2's for? You could at least add a comment. God, the secretive downvoters bug me. How is anyone supposed to improve their question if you never say why you're downvoting them? grr – Liam Jan 28 '14 at 13:58

0 Answers