I'm a little confused here. Does ArrayBuffer allocate a new memory region for it? If so, what would be the safe maximum Blob size to put in it?
-
I'm not sure I understand why you're asking this in the first place. Have you found a problem? As for Blob size limit, I couldn't find any, so I think it might be up to the memory of the client's computer and the server's upload limit. – J_A_X Jul 24 '13 at 02:14
-
Yes, in a basic test I tried to allocate a large buffer of around 512MB (I was testing limits) in Firefox, and it just crashed. – cavila Jul 25 '13 at 01:46
-
I also tried to allocate something similar in a watch variable in the Chrome inspector, and now Chrome crashes too when I open the inspector pane. I expected browsers to throw some kind of error like OUT_OF_MEMORY. I'm interested in using file readers to process large files on the client. Currently I can avoid crashes by slicing reads of large files (huge images) into smaller buffers. – cavila Jul 25 '13 at 01:57
-
I'm not finding anything about a standard limit, so I'm guessing it's up to the browser's discretion as well as the user's available RAM. – J_A_X Aug 06 '13 at 01:42
-
Thanks. Well, for now, it's better not to allocate large blocks since browsers aren't making any checks. I'm using Blob.slice to divide the work into chunks. – cavila Aug 06 '13 at 20:00
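For reference, the chunked approach mentioned in the comments could be sketched roughly like this (the function name, chunk size, and callback shape are illustrative assumptions, not from the original posts):

```javascript
// Sketch: process a Blob in fixed-size slices so that only one small
// ArrayBuffer is alive at a time, instead of reading the whole file at once.
async function processInChunks (blob, chunkSize, handleChunk) {
  for (let offset = 0; offset < blob.size; offset += chunkSize) {
    // slice() is cheap: it creates a view on the Blob, not a copy of the data
    const slice = blob.slice(offset, offset + chunkSize)
    const buffer = await slice.arrayBuffer() // small allocation only
    handleChunk(buffer, offset)
  }
}
```

Usage would look something like `await processInChunks(file, 4 * 1024 * 1024, (buf, offset) => { /* ... */ })`, keeping each allocation at 4MB regardless of the file's total size.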
6 Answers
I needed to know this myself, so I wrote a script that can search for the max value in the quickest possible way using binary search. (This is an adaptation of https://stackoverflow.com/a/35941703/1008999, but for BigInts.)
/**
* Binary search for a max value without knowing the exact value, only that it
* can be under or over. It does not test every number but instead tries
* 1,2,4,8,16,32,64,128,96,95 to figure out that you thought about #96 from
* 0-infinity
*
* @example findFirstPositive(x => matchMedia(`(max-resolution: ${x}dpi)`).matches)
* @author Jimmy Wärting
* @see {@link https://stackoverflow.com/a/72124984/1008999}
* @param {function} f The function to run the test on (should return truthy or falsy values)
* @param {bigint} [b=1] Where to start looking from
* @param {function} d privately used to calculate the next value to test
* @returns {bigint} Integer
*/
function findFirstPositive (f,b=1n,d=(e,g,c)=>g<e?-1:0<f(c=e+g>>1n)?c==e||0>=f(c-1n)?c:d(e,c-1n):d(c+1n,g)) {
for (;0>=f(b);b<<=1n);return d(b>>1n,b)-1n
}
const tries = []
const maxSize = findFirstPositive(x => {
tries.push(Number(x).toLocaleString())
try { new ArrayBuffer(Number(x)); return false } catch { return true }
})
console.log('found it in', tries.length, 'attempts')
console.log(Number(maxSize))
console.log(tries)
Here are some results on my macOS machine:
- Chrome 2145386496
- Safari 4294967296
- Firefox 8589934592

That depends only on your system; there doesn't seem to be a limit.
According to the specification:
If the requested number of bytes could not be allocated an exception is raised.
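A minimal sketch of that behavior (assuming a modern engine, where the exception raised in practice is a RangeError):

```javascript
// Request an allocation far beyond what any current engine supports.
// Per the spec, an impossible allocation raises an exception rather than
// returning a broken buffer.
try {
  const buf = new ArrayBuffer(Number.MAX_SAFE_INTEGER) // ~8 PiB
  console.log('allocated', buf.byteLength, 'bytes')
} catch (e) {
  console.log(e instanceof RangeError) // engines reject oversized requests with a RangeError
}
```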

-
The document your link is pointing to does not contain the words you cite. Do you mean ["If it is impossible to create such a Data Block, throw a RangeError exception."](https://www.ecma-international.org/ecma-262/6.0/#sec-createbytedatablock)? – Daerdemandt Jun 29 '17 at 13:35
-
@Daerdemandt My link is pointing to the "latest" version of the specifications, so I guess it just evolved, but yes, the sentence you quote sounds like the new reworded equivalent to the one in my answer. – Sebastien C. Jul 31 '17 at 10:24
what would be the safe maximum Blob size to put on it
There does not seem to be a hard limit, just whatever restrictions your platform imposes.
However, if one uses some sort of indexed access, indexes shouldn't be greater than Number.MAX_SAFE_INTEGER, because interesting bugs would happen otherwise.
Luckily, 2^53-1 bytes is around 8 petabytes, so it shouldn't be a concern unless you are doing something really weird.
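The "interesting bugs" come from floating-point precision: past Number.MAX_SAFE_INTEGER, distinct integers can no longer be represented exactly, so "different" indexes collapse into the same value. A quick illustration:

```javascript
// Beyond 2^53 - 1, Number can't represent every integer exactly,
// so two indexes that should differ compare as equal.
const max = Number.MAX_SAFE_INTEGER // 2 ** 53 - 1
console.log(max + 1 === max + 2)           // true: index collision
console.log(Number.isSafeInteger(max + 1)) // false
```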

Here's an updated answer, at least according to Mozilla in July 2019:
The length property of an Array or an ArrayBuffer is represented with an unsigned 32-bit integer, that can only store values which are in the range from 0 to (2^32)-1.
(from https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Errors/Invalid_array_length)
More details as of 2020-01-09:
Firefox seems to limit the size of the underlying buffer (so the limit is the number of bytes), whereas Chrome seems to limit the number of elements in a typed array. Also, Firefox's limit seems to be lower than what the Mozilla link I posted says. According to that link, this
new ArrayBuffer(Math.pow(2, 32) - 1)
is a "valid case", but it throws a RangeError when run in the console (Firefox 72.0.1, 64-bit, on Windows 10).

I was also asking myself the same question. It seems that the buffer is limited by available system memory and by the ability of the underlying JavaScript engine's GC to handle large amounts of memory. You can easily test it for your platform by creating large buffers in your browser's console while monitoring the browser's process and overall memory footprint.
I just managed to create buffers larger than 512MB in Chrome 37. However, I have 4GB of system memory, so further allocation could obviously lead to a crash. I'm not interested in finding the breaking point since I'm fine with allocations up to 100MB, but you can easily test it yourself.

Mozilla has a preference to raise this limit above 2GB: javascript.options.large_arraybuffers.
It is false by default, but should become true in Firefox 89.
