
I know Math.random() is smaller than 1, but the problem is, Math.random() generates floating point numbers, and floating point addition may have rounding errors.

So my question is: is there any possible value of n for which n + Math.random() < n + 1 is false?
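The rounding the question worries about is easy to demonstrate with the classic example of a sum whose exact value is not representable as a double:

```javascript
// The exact sum 0.3 is not representable as an IEEE 754 double,
// so the nearest representable value is returned instead.
console.log(0.1 + 0.2);          // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);  // false
```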

ocomfd

3 Answers


The largest possible Math.random() result is the largest double that is strictly less than 1.0. Adding one to it gives a real-number result exactly halfway between 2.0 and the largest double that is less than 2.0. Round-to-nearest will round it to 2.0, because it is a tie and 2.0 has the even significand.

You need to allow for n+Math.random() < n+1 being false due to rounding.
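The tie described above can be reproduced directly. This is a sketch assuming the maximum Math.random() result is the largest double below 1.0, which is 1 - Number.EPSILON / 2:

```javascript
// Largest double strictly less than 1.0 is 1 - 2^-53
// (Number.EPSILON is 2^-52, the gap between 1.0 and the next double up).
const maxBelowOne = 1 - Number.EPSILON / 2;
console.log(maxBelowOne < 1);          // true

// The exact sum 1 + maxBelowOne lies exactly halfway between
// 2 - 2^-52 and 2.0; round-to-nearest-even resolves the tie to 2.0.
console.log(1 + maxBelowOne);          // 2
console.log(1 + maxBelowOne < 1 + 1);  // false
```

So if Math.random() ever returns that maximum value, the comparison is false for n = 1.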

Patricia Shanahan

Yes, there are a few. Once the magnitude of the number is high enough, the engine can no longer tell the difference between n and n + 1:

// For each n, check whether n + Math.random() < n + 1 holds.
const verify = n => console.log(n + Math.random() < n + 1);
[
  Number.MAX_SAFE_INTEGER + 1, // 2^53: adding a fraction below 1 rounds away
  Infinity,                    // Infinity < Infinity is false
  -Infinity                    // -Infinity < -Infinity is false
].forEach(verify);

Though this isn't so much a Math.random quirk as a JS number-precision quirk.

CertainPerformance

If Math.random() could equal 1, then n + Math.random() would equal n + 1.

But Math.random() returns a value in the range [0, 1) (inclusive of 0, exclusive of 1).

So it's not possible for n + Math.random() < n + 1 to be false.

    This would be good reasoning if the question were about real number arithmetic, but it is, in effect, asking about IEEE 754 64-bit binary arithmetic, which has its own rules. – Patricia Shanahan Oct 29 '18 at 08:57