So, running the following script in the console:
let n = 1n;
let p = 1;
while (p < 10001) {
  n *= 1267650600228229401496703205376n; // 2^100
  console.log({ p: p++ });
  console.log(n);
}
does not "error", even though I'm multiplying by 1267650600228229401496703205376n (2^100) on each iteration. I easily got up to the equivalent of 10000 * 100 bits (1,000,000 bits). However, the resulting value logged to the console is:
78102406353244117148310442417059631533061780115528…123691650081257805509908275407049892733047013376n
and I can't seem to retrieve any of the internal digits of the number (hidden in the "…"), so I'm not sure how much use it is to me. In theory, it's still calculating fine, and I'm sure I could have kept going. It took about five minutes to calculate all of that on my 2019 MacBook Pro.
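For what it's worth, the "…" seems to be just the console truncating its preview; converting the BigInt to a string should give every digit. A minimal sketch, assuming the n from the loop above is still in scope (the slice positions are arbitrary examples):

  // Assumes `n` is the BigInt left over from the loop above.
  const digits = n.toString();           // full decimal representation, no truncation
  console.log(digits.length);            // how many decimal digits there are
  console.log(n.toString(2).length);     // length in binary digits, i.e. the bit count
  console.log(digits.slice(0, 50));      // first 50 digits
  console.log(digits.slice(-50));        // last 50 digits
  console.log(digits.slice(1000, 1050)); // 50 digits from somewhere in the middle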
(NOTE: I did not use while(true), because it really started to lock up Chrome after a while. I could not stop the loop and could only shut Chrome down by force, so I decided to put in a max value that seemed relatively high and more than enough for anything most of us would want to deal with in the foreseeable future.)
Point is, it looks like you can handle pretty big numbers this way.