
It turns out (after a bit of thought it's more obvious, but whatever) that BigInt, recently introduced to JavaScript, has a limit.


My question would be - is there a constant similar to Number.MAX_SAFE_INTEGER but for BigInt?

This snippet of code:

    let a = 2n, step = 1;
    try {
      while (true) {
        console.log(step);
        a = a ** 2n;
        step++;
      }
    } catch (e) {
      console.log(e);
    }

This shows that the limit is reached at about step = 32, at least in Chrome. But I wonder what this value is per the spec.

shabunc
  • Is this actually part of a spec yet, or has Chrome just added in-built support for things that were previously available via js libraries? – James Thorpe Nov 16 '18 at 10:14
  • Looks like [it's a candidate spec](https://github.com/tc39/proposal-bigint) that Chrome (and a few others) are already supporting. – James Thorpe Nov 16 '18 at 10:18
  • From https://developers.google.com/web/updates/2018/05/bigint: `// Highest possible BigInt value that can be represented as a signed 64-bit integer. const max = 2n ** (64n - 1n) - 1n;` – Grzesiek Danowski Nov 16 '18 at 10:19
  • @JamesThorpe Not sure it's something you could polyfill with a js lib, as it's also got primitive number format, something you could transpile of course. Babel seems to have the syntax plugin for it, but couldn't find the compiler part. – Keith Nov 16 '18 at 10:22
  • Seems, it is `(((1n << (2n**30n - 1n)) - 1n) << 1n) | 1n`. – Viktor Mukhachev Aug 17 '20 at 17:09

5 Answers


It seems like there is no maximum limit to a BigInt as per spec, which makes sense considering BigInts are supposed to be arbitrary-precision integers, whose "digits of precision are limited only by the available memory of the host system".

As for V8 specifically, according to this article on the V8 blog, the precision of BigInts is "arbitrary up to an implementation-defined limit". Unfortunately, I couldn't find any further information on how that limit is determined. Maybe someone else can shed light on this based on these V8 BigInt implementation notes?

That said, based on the aforementioned articles, there doesn't seem to be a specific maximum value/size for a BigInt. Rather, it is likely determined based on the available memory on the system in some way.

zwliew
    Correct, there is no spec-defined limit. Last I checked, Firefox and Safari both allowed up to 1 million bits, which is probably more than enough for all practical purposes. V8 (Chrome/Node/Opera/Edge/...) currently allows up to 1 billion bits. As the limit is implementation-defined, each engine could change it in the future (in either direction). – jmrk Apr 26 '23 at 08:57

The maximum size of a BigInt in WebKit is defined as follows:

  // The maximum length that the current implementation supports would be
  // maxInt / digitBits. However, we use a lower limit for now, because
  // raising it later is easier than lowering it.
  // Support up to 1 million bits.
  static constexpr unsigned maxLength = 1024 * 1024 / (sizeof(void*) * bitsPerByte);

The size of void* is platform dependent, 8 on 64 bit systems.

So there's your answer, right? It should be 16384 bits... (minus 1 for the sign). But I can't create anywhere near that large a number in the console.
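Working through that expression on a 64-bit platform (assuming `sizeof(void*) == 8` and `bitsPerByte == 8`), one reading is that `maxLength` counts internal digits (machine words), not bits, which would be why the source comment says "up to 1 million bits" rather than 16384. A sketch of the arithmetic:

```javascript
const bitsPerByte = 8;
const sizeofVoidPtr = 8; // assumption: 64-bit platform
const maxLength = (1024 * 1024) / (sizeofVoidPtr * bitsPerByte);
console.log(maxLength); // 16384 internal digits (machine words)

// If each digit is one 64-bit word, the total is back to a million bits:
console.log(maxLength * sizeofVoidPtr * bitsPerByte); // 1048576
```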

Alec Joy
    In case anyone is wondering what size of 16384 bits means; it would be a `long` in Java, meaning it's 18.446.744.073.709.551.616 or 2^64, but I think it probably is unsigned, so that means it's 9.223.372.036.854.775.807 positive and 9.223.372.036.854.775.808 negative numbers – Keimeno Feb 22 '21 at 23:15
  • @Alec: the comment clearly says "Support up to 1 million bits". Creating such BigInts in the console is easy if you don't try to manually type out their decimal representation. For example, `BigInt('0x' + 'f'.repeat(250000))` will have exactly 1 million bits, so will `2n ** 999_999n`. – jmrk Aug 13 '21 at 10:33
  • @Keimeno: The limits you've posted are for 64-bit integers. A 16384-bit integer can get much bigger, up to 2^16383, which has 4933 decimal digits and hence doesn't fit into this comment. Since BigInts in WebKit support up to a million bits (currently; WebKit developers might change this in either direction), they can grow up to (2^1000000) - 1. – jmrk Aug 13 '21 at 10:37
  • 16384 bits (which gets you over 4900 decimal digits) occupies about 2KB of memory. The spec, however, implies that the limit should be guided by "available memory", suggesting that decent implementations should allow *much* more. In theory, ranges in the hundreds of millions of decimal digits, given that I have successfully used strings that are over 200 million characters in in-browser JavaScript implementations. Any limit lower than this would show that implementations are imposing their own much more conservative limits. – thomasrutter Jul 06 '22 at 07:46
  • @thomasrutter I know what the spec says, but in practice there has to be a limit defined somewhere. My response is taken from the source code of chromium to show what, at least at the time, the value actually was. – Alec Joy Jul 13 '22 at 20:36

The size has arbitrary precision, per 4.3.25 (BigInt value), though it is oddly not mentioned in the 20.2 BigInt Objects section.

Here is a quick test program:

/* global BigInt */

let b = BigInt(10);
let exp = 1;
while (true) { // runs until the engine hits its limit
    console.log(`BigInt of 10^${exp}`);
    b = b * b;
    exp *= 2;
}

output with Node v13.2:

BigInt of 10^1
BigInt of 10^2
BigInt of 10^4
...
BigInt of 10^4194304
BigInt of 10^8388608
BigInt of 10^16777216

Performance really drags after about 10 to the millionth.

While there may be a platform-specific maximum in a specific browser, the size required to demonstrate it would be large. Even 10^10^6 takes over 300 KB to store. You could extend the spec to add a limit via tetration, e.g., "the limit is about 10^10^10^... 8 times", but, seriously, that would be silly.

Charles Merriam

It turns out the limit is 2^30 bits, making the largest value 2^(2^30) - 1. Any more bits and it wouldn't work. (I was unable to print the actual value, since the console can't render a number that large.) To put that in perspective, that number has nearly as many decimal digits (about 323 million) as there are people in the US!

(Wolfram Alpha link for the number of digits)

Very big indeed!

Tested on a modern Chrome browser.
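The digit count is easy to estimate without materializing the number, using the identity that 2^n - 1 has floor(n · log10 2) + 1 decimal digits (a quick sketch):

```javascript
// Decimal digits of 2^(2^30) - 1, computed from logarithms alone.
const bits = 2 ** 30;
const digits = Math.floor(bits * Math.log10(2)) + 1;
console.log(digits); // 323228497 - about 323 million digits
```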

Infigon
    Testing this in a Chromium based browser resulted in the console timing out until eventually the browser ran out of memory and crashed the page. It's quite interesting that it can detect a failure faster than display the result in the console. – Codingale Mar 30 '22 at 18:58
  • @Codingale You can compute the number of bits and based on that generate a `RangeError`. Computing the number of bits is fast (i.e. it's calculating `n` in `2**n`). – Alexis Wilke Jan 01 '23 at 17:35

So, running the following script in the console:


let n = 1n;
let p = 1;

while (p < 10001) {
  n *= 1267650600228229401496703205376n; // 2^100
  console.log({ p: p++ });
  console.log(n);
}

does not "error", even though I'm multiplying by 1267650600228229401496703205376n (2^100) on each iteration; I easily got up to the equivalent of 10000 × 100 bits (1,000,000 bits).

... however, the resulting value logged to the console is 78102406353244117148310442417059631533061780115528…123691650081257805509908275407049892733047013376n, and I can't seem to retrieve any of the internal parts of the number (in the ...), so I'm not sure how much use it is to me. In theory, it's still calculating fine, and I'm sure I could have kept going. It took about 5 minutes or so to calculate all that on my 2019 MacBook Pro.

(NOTE: I did not use while(true), because it really started to lock up Chrome after a while. I could not stop the algorithm and could only shut down Chrome by force, so I put in a max value that seemed relatively high and definitely good enough for anything most of us would want to deal with in the foreseeable future.)

Point is, it looks like you can handle pretty big numbers this way.

CStroliaDavis
  • Okay, I played around with it to verify: `let n = 1n; let p = 1; while(p<10001) { n *= 1267650600228229401496703205376n // 2^100; p++; } console.log({p: p-1}) console.log(n); let x = 1; while(n>1) { n /= 2n; x++; } console.log({x: x-1}); ` helps prove the result (and it runs much faster) – CStroliaDavis Apr 23 '23 at 15:01
  • I recently found that using `toString` on the number should give you the whole thing. I decided not to get the whole string of the one I generated above because it would be over 300,000 characters long. – CStroliaDavis Apr 23 '23 at 15:14