I am trying to read a file using FileReader. The file is 5.9 GB, and when this code runs:

const file = document.getElementById('uploadFileId').files[0];
const reader = new FileReader();
reader.onerror = function () {
    console.log(reader.error);
};
reader.onload = function (e) {
    console.log('e.target.result', e.target.result);
};
reader.readAsArrayBuffer(file);

then the above error is generated (the app uses AngularJS). What I want to achieve is to split the file into 5 MB chunks and send them to the server as a multipart upload.

satish wanjari

3 Answers

I'm getting the same message, but only for files over 2 GB. It seems as though there is a file size limit that triggers this unhelpful message.

Glenn Johnson
  • It'd be really nice if we could find some sort of official documentation of such a limit. I'm getting this error in Chrome and Safari, but not Firefox. Firefox will read the file correctly. – thelr Apr 14 '22 at 21:11
  • @thelr this seems to relate to the ArrayBuffer limit, as seen here: https://stackoverflow.com/questions/17823225/do-arraybuffers-have-a-maximum-length – user5480949 Jul 25 '22 at 20:32
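
A quick way to confirm that limit in any given browser is to attempt a large allocation directly. A minimal sketch (the 8 GB figure is arbitrary, and the exact ceiling varies by browser and version):

// Probe the engine's ArrayBuffer ceiling.
try {
  new ArrayBuffer(8 * 1024 ** 3); // request 8 GB
  console.log('8 GB ArrayBuffer allocated in this engine');
} catch (err) {
  console.log(err); // e.g. RangeError when the size exceeds the limit
}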

This seems related to Chrome's 2 GB ArrayBuffer size limit (other browsers have higher limits).
One solution is to upload the file in chunks and then save them all to a file on the server:

const writableStream = new WritableStream({
  start(controller) { },
  async write(chunk, controller) {
    // called once per chunk as the browser streams the file
    console.log(chunk);
    // upload the chunk here
  },
  close() {
    // every chunk has been written
  },
  abort(reason) {
    // the stream errored; abandon any partial upload
  },
});

// Blob.stream() reads the file incrementally, so the whole file
// never has to fit in a single ArrayBuffer.
const stream = e.target.files[0].stream();
stream.pipeTo(writableStream);
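
To get the 5 MB chunks the question asks for, you can also skip streams entirely: Blob.prototype.slice cuts the File into fixed-size pieces without reading anything into memory until each piece is uploaded. A rough sketch; the /upload endpoint and the chunk/index/total form fields are placeholders for whatever your server actually expects:

const CHUNK_SIZE = 5 * 1024 * 1024; // 5 MB, per the question

async function uploadInChunks(file) {
  const totalChunks = Math.ceil(file.size / CHUNK_SIZE);
  for (let index = 0; index < totalChunks; index++) {
    const start = index * CHUNK_SIZE;
    // slice() only records byte offsets; no data is read yet
    const chunk = file.slice(start, start + CHUNK_SIZE);
    const form = new FormData();
    form.append('chunk', chunk, file.name); // field names are placeholders
    form.append('index', String(index));
    form.append('total', String(totalChunks));
    // '/upload' is a placeholder endpoint that reassembles chunks by index
    const response = await fetch('/upload', { method: 'POST', body: form });
    if (!response.ok) throw new Error('Chunk ' + index + ' failed');
  }
}

uploadInChunks(document.getElementById('uploadFileId').files[0]);

The server-side handler then appends the pieces in index order to rebuild the original file.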
user5480949

This error can also happen when the file is changed on disk after it was selected or while it is being read.

Abdul Rehman