There is this excellent question about generating basic random numbers in JavaScript within a specific range:

Generating random whole numbers in JavaScript in a specific range?
```
function getRandomInt(min, max) {
  min = Math.ceil(min);
  max = Math.floor(max);
  return Math.floor(Math.random() * (max - min + 1)) + min;
}
```
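A quick check that this stays in range (repeating the function here so the snippet runs standalone):

```javascript
// getRandomInt from the linked question, repeated so this runs on its own.
function getRandomInt(min, max) {
  min = Math.ceil(min);
  max = Math.floor(max);
  return Math.floor(Math.random() * (max - min + 1)) + min;
}

// Every result lands in the inclusive range [1, 6], like a die roll.
for (let i = 0; i < 5; i++) {
  console.log(getRandomInt(1, 6));
}
```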
How do you do the same thing with BigInt in plain JavaScript (i.e. not with Node.js or crypto.randomBytes)? Something that works across environments (with BigInt support)?
(Thinking out loud...) You can't just append n to the numbers in that formula to turn them into BigInts, because Math.random() returns a non-BigInt. For example, Math.random() might return 0.5427862726372646, which has 16 decimal places. But if I have a BigInt like 41223334444555556666667777777888888889999999997n, that is a 47-digit number, so multiplying 0.5427862726372646 * (10**46) gives 5.4278627263726465e+45. Wrap that in BigInt and you get BigInt(0.5427862726372646 * (10**46)) equal to 5427862726372646467145376115182187925303459840n. Hmmm... Is that the solution?
```
41223334444555556666667777777888888889999999997n   (my 47-digit BigInt)
5427862726372646467145376115182187925303459840n    (the BigInt() result)
```
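As a quick sanity check on that value (the digit strings here are just copied from above), the first 16 digits of the BigInt line up with the float's own digits; the rest I can't account for yet:

```javascript
const r = 0.5427862726372646;      // the example Math.random() value from above
const big = BigInt(r * (10 ** 46));

// The BigInt's leading digits match the float's 16 significant digits.
console.log(big.toString().slice(0, 16)); // "5427862726372646"
console.log(big.toString().length);       // 46 digits total
```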
How did that work out? If that is the solution, I stumbled upon it just now while trying to ask this question. Can you double-check that this is correct, and perhaps explain how a regular JavaScript number (in e+45 notation), when passed to the BigInt constructor, results in such a seemingly detailed BigInt value? So let me try again:
```
BigInt(Math.random() * (10 ** 45))
// => 329069627052628509799118993772820125779492864n
```
Hmmm. I don't understand: Math.random() only gave 16 significant digits?
```
const rand = Math.random()
// => 0.7894008119121056
BigInt(rand * (10 ** 45))
// => 789400811912105533187528403423793891092987904n
```
That doesn't make sense. I would have expected:

```
789400811912105600000000000000000000000000000
```

That is, 0.7894008119121056 shifted over 45 decimal places.
I'm not sure why this works. Is this behavior specific to Chrome?
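One thing I can at least verify is how this value compares with what a regular number can represent exactly; presumably everything past the first ~16 digits is floating-point artifact rather than real precision:

```javascript
// Doubles represent integers exactly only up to Number.MAX_SAFE_INTEGER.
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991 (about 9e15, ~16 digits)

// 0.7894008119121056 * 10**45 is astronomically larger than that,
// so its trailing digits cannot all be exact.
console.log(0.7894008119121056 * (10 ** 45) > Number.MAX_SAFE_INTEGER); // true
```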
So one final test:
```
console.log('10 ** 35 <=> 10 ** 45')
console.log(rint(10 ** 35, 10 ** 45).toString())
console.log('10 ** 35 <=> 10 ** 45')
console.log(rint(10 ** 35, 10 ** 45).toString())
console.log('10 ** 20 <=> 10 ** 40')
console.log(rint(10 ** 20, 10 ** 40).toString())
console.log('10 ** 20 <=> 10 ** 40')
console.log(rint(10 ** 20, 10 ** 40).toString())

function rint(min, max) {
  return BigInt(Math.random() * max - min + 1) + BigInt(min)
}
```
I am getting e.g.:

```
10 ** 35 <=> 10 ** 45
485253180777775593983353876860021068179439616n
10 ** 35 <=> 10 ** 45
178233587725359997576391063983941630941986816n
10 ** 20 <=> 10 ** 40
8245114695932740733462636549119006474240n
10 ** 20 <=> 10 ** 40
7214182941774644957099293094661617352704n
```
Is this a correct implementation, then? And how would you implement it to get a random integer from 0 up to an arbitrarily large BigInt?
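For reference, here is my best guess at an approach that avoids the float multiply entirely, by concatenating random decimal digits and retrying when the result overshoots. The function name and structure are my own invention; this is only a sketch, it is not cryptographically secure, and the digit distribution from Math.random().toString() may not be perfectly uniform:

```javascript
// Sketch: random BigInt in [0, max) via digit concatenation plus
// rejection sampling. Assumes max is a positive BigInt.
function randomBigInt(max) {
  const digits = max.toString().length;
  let result;
  do {
    let s = '';
    while (s.length < digits) {
      // Take up to 15 decimal digits per call; doubles carry
      // roughly 15-17 significant digits, so these are reliable.
      s += Math.random().toString().slice(2, 17);
    }
    result = BigInt(s.slice(0, digits));
  } while (result >= max); // reject and retry if we overshoot
  return result;
}

console.log(randomBigInt(41223334444555556666667777777888888889999999997n).toString());
```

Since max has d digits, a d-digit draw lands below max at least roughly 10% of the time, so the retry loop should terminate quickly on average. But I am not sure this is right either.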