
Why does Math.round() return different results for decimal numbers of different precision (number of digits after the decimal point) in JS? For example:

console.log(Math.round(-0.5000000000000001) === 1); // true
console.log(Math.round(-0.50000000000000001) === 1); // false, is equal to 0

console.log(Math.round(0.4999999999999999) === 0); // true
console.log(Math.round(0.49999999999999999) === 0); // false, is equal to 1

It works as expected for both positive and negative numbers when the precision (number of digits after the decimal point) is 16 or fewer, but with 17 or more digits it gives a different result.
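A quick way to see what is going on is to compare the long literals against 0.5 and -0.5 directly. This is a minimal sketch assuming Node.js or a browser console; it shows that the 17-digit literals are already indistinguishable from ±0.5 before Math.round() ever runs:

```javascript
// A 64-bit IEEE-754 double carries roughly 15-17 significant decimal
// digits. A deviation in the 17th digit after the decimal point is
// smaller than half a ULP at 0.5, so the parser rounds it away.

// 16 digits: the value is still representably below 0.5.
console.log(0.4999999999999999 === 0.5);    // false

// 17 digits: the literal collapses to exactly 0.5 when parsed.
console.log(0.49999999999999999 === 0.5);   // true

// Same story for the negative case.
console.log(-0.50000000000000001 === -0.5); // true
```

So the "17 or more digits" threshold is not about Math.round() at all: by that point the source-code literal and ±0.5 denote the same stored number.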

  • I think JavaScript can only calculate up to 14 decimal places. It goes a bit weird past that point. – Joachim Jun 25 '22 at 13:39
  • maybe worth a look -> https://stackoverflow.com/questions/588004/is-floating-point-math-broken – Nina Scholz Jun 25 '22 at 13:41
  • thanks Joachim, so you mean that this is the language's defect? what is the computer term for such issues that a language could have? – hossein1976 Jun 25 '22 at 13:42
  • `Math.round(-0.5000000000000001) === 1` is `false`, not `true` as you claim in the question (`=== -1` would be `true`). But basically it's because [IEEE-754 binary double-precision floating point numbers](https://en.wikipedia.org/wiki/IEEE_754) are imperfect, and you're exceeding the bounds of what they can represent. The literal `-0.5000000000000001` does indeed denote a value just less than `-0.5`, but your second one with `-0.50000000000000001` (one more `0`) just represents `-0.5`. – T.J. Crowder Jun 25 '22 at 13:44
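The comment above can be checked directly. This sketch uses Number.prototype.toPrecision to reveal what the parser actually stored for each literal, and Object.is to distinguish -0 from 0 (an assumption worth noting: console output shown is from Node.js):

```javascript
// 16 digits: the stored double is strictly below -0.5,
// so Math.round (which rounds halves toward +Infinity) gives -1.
console.log((-0.5000000000000001).toPrecision(17)); // -0.50000000000000011
console.log(Math.round(-0.5000000000000001));       // -1

// 17 digits: the literal parses to exactly -0.5,
// and Math.round(-0.5) is defined to be -0 (which === 0).
console.log((-0.50000000000000001).toPrecision(17)); // -0.50000000000000000
console.log(Math.round(-0.50000000000000001));       // -0
console.log(Object.is(Math.round(-0.50000000000000001), -0)); // true
```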

0 Answers