Using Chrome, I set up an XMLHttpRequest:
```js
const xhr = new XMLHttpRequest();
xhr.open(method, url, true);
...
xhr.onreadystatechange = () => {
  if (xhr.readyState === XMLHttpRequest.DONE) {
    if (isStatusCodeSuccess(xhr.status)) {
      // handle response
    } else {
      // handle error
    }
  }
};
xhr.addEventListener('progress', event => {
  console.log(event.loaded, event.target.response.length);
});
xhr.send(data);
```
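(`isStatusCodeSuccess` is just a small helper of mine, something along these lines:)

```js
// My helper, roughly: treat any 2xx status as success.
const isStatusCodeSuccess = status => status >= 200 && status < 300;
```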
At some point I request some JSON data whose total size is around 295 MB uncompressed / 5.8 MB compressed. Unfortunately, when the success branch above is reached, the response is an empty string.
Using the progress event listener, I can see the response arriving chunk by chunk as expected, up to a certain point. Here is an excerpt of the console logs it produces (`event.loaded` followed by the response length):
```
32768 32768
4639135 4639135
7376739 7376739
11525403 11525403
...
261489180 261489180
264684030 264684030
267880247 267880247
271037819 0
274232442 0
277428774 0
...
304018210 0
309230213 0
310445469 0
```
It looks like Chrome has a string/allocation size limitation, but it doesn't raise any error.
On Firefox I receive the following error: `InternalError: allocation size overflow`.
I tried storing the result as it arrives, but I can't "empty" the XMLHttpRequest object, as its attributes are read-only.
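Roughly, the attempt looked like this (a sketch, not my exact code):

```js
let buffered = '';
let offset = 0;
xhr.addEventListener('progress', event => {
  const text = event.target.response; // read-only, and it keeps growing
  buffered += text.slice(offset);     // copies only the new bytes...
  offset = text.length;
  // ...but the full string still lives inside the XHR object,
  // so the memory usage (and the string size cap) is not avoided
});
```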
What is the official limit for string size in browsers? I could not find an official answer, only experimental ones such as "Javascript string size limit: 256 MB for me - is it the same for all browsers?".
Is there any way to work around this issue and handle a large POST result, apart from using WebSockets?
I am thinking of:
- a specific content type to stream the data, like `application/octet-stream` (see the fetch sketch after this list)
- a specific parameter to handle the data chunk by chunk, like the `Range` header, although the content I'm fetching is not static (it changes over time)
- a known algorithm to reduce the length of the JSON response, so that I can unzip the data chunk by chunk and avoid hitting the limitation above (see the `DecompressionStream` sketch below)
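For the first idea, here is a sketch of what I mean using `fetch()` instead of XHR (the same `method`, `url` and `data` as above; I haven't validated this against my server):

```js
// Consume the response body as a stream, chunk by chunk, so that no
// single JavaScript string ever has to hold the whole 295 MB payload.
async function fetchStreamed(method, url, data) {
  const response = await fetch(url, { method, body: data });
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let received = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    received += value.byteLength;
    const chunkText = decoder.decode(value, { stream: true });
    // hand chunkText to an incremental JSON parser here,
    // instead of concatenating it into one big string
    console.log(received, chunkText.length);
  }
}
```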
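For the third idea, if the compressed payload were served without `Content-Encoding` (so the browser doesn't inflate it automatically), something like the `DecompressionStream` API might let me unzip chunk by chunk; again only a sketch:

```js
// Inflate a gzip response incrementally; each decompressed chunk is
// processed and dropped rather than accumulated into one string.
async function fetchGzippedStreamed(url) {
  const response = await fetch(url);
  const inflated = response.body.pipeThrough(new DecompressionStream('gzip'));
  const reader = inflated.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    const chunkText = decoder.decode(value, { stream: true });
    // parse chunkText incrementally here
  }
}
```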