
I want to load multiple large files. To do this I pinched the code from

javascript FileReader - parsing long file in chunks

Each chunk is given a part number and passed to a Python CGI program, along with the total number of chunks to expect. The Python CGI concatenates the chunks into the desired file once the full set has been received. This works.

Now I want to load multiple files, with each file broken into chunks as before. Each file has an id number so Python knows to store it in a different directory. However, I run out of memory and it crashes. In

Reading multiple files with Javascript FileReader API one at a time

there are suggestions on how to convince JavaScript to process one file at a time, sequentially rather than asynchronously, but my attempts to combine that code with my current code have been unsuccessful.

The file-splitting code is below. The variable numparts terminates the recursive call.

Can anyone suggest how to modify the loop at the bottom so that setupReader executes sequentially, please?

 function setupReader(file, filename) {
    var fileSize = file.size - 1;
    var chunkSize = 150000000;              // 150 MB per chunk
    var offset = 0;
    var numparts = Math.ceil(fileSize / chunkSize);
    var chunkReaderBlock = null;
    var partno = 0;

    var readEventHandler = function(evt) {
       if (evt.target.error == null) {
           offset += chunkSize;
           callback(evt.target.result);     // callback for handling read chunk
       } else {
           console.log("Read error: " + evt.target.error);
           alert("File could not be read");
           return;
       }
       if (partno >= numparts) {
           console.log("Done reading file");
           return;
       }
       chunkReaderBlock(offset, chunkSize, file);  // read the next chunk
    };

    var callback = function(result) {       // declared with var so each file gets its own
       partno += 1;
       var bfile = result;
       var query = "?fileName=" + filename + "&Part=" + partno + "&of=" + numparts + "&bfile=" + bfile;
       loadDoc('partition0', query, 0);
    };

    chunkReaderBlock = function(_offset, length, _file) {
       var r = new FileReader();
       var blob = _file.slice(_offset, length + _offset);
       r.onload = readEventHandler;
       r.readAsDataURL(blob);
    };

    chunkReaderBlock(offset, chunkSize, file);  // kick off the first read
 }

 for (var i = 0; i < numfiles; i++) {
    setupReader(fileList[i], fileList[i].name);
 }
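One way to make this sequential is to drop the callback recursion entirely and use promises with async/await, since Blob.slice plus Blob.arrayBuffer (or FileReader wrapped in a promise) lets a plain loop wait for each chunk before starting the next. The sketch below is a minimal illustration of that idea, not the original code: uploadChunk is a hypothetical stand-in for the loadDoc call, and the chunk/part numbering mirrors the fileName/Part/of scheme above.

```javascript
// Sketch: read one file's chunks strictly in order.
// uploadChunk(filename, partno, numparts, data) is a hypothetical stand-in
// for the original loadDoc('partition0', query, 0) call.
async function readFileInChunks(file, filename, chunkSize, uploadChunk) {
  const numparts = Math.ceil(file.size / chunkSize);
  for (let partno = 1; partno <= numparts; partno++) {
    const offset = (partno - 1) * chunkSize;
    const blob = file.slice(offset, offset + chunkSize);
    // blob.arrayBuffer() returns a promise, so `await` pauses the loop
    // until this chunk is fully read before the next slice starts.
    const data = await blob.arrayBuffer();
    await uploadChunk(filename, partno, numparts, data);
  }
}

// Process the files one at a time: the next file only starts
// after every chunk of the previous one has been uploaded.
async function processAllFiles(fileList, chunkSize, uploadChunk) {
  for (const file of fileList) {
    await readFileInChunks(file, file.name, chunkSize, uploadChunk);
  }
}
```

Because only one chunk is in flight at a time, memory use stays bounded by the chunk size rather than growing with the number of files.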

1 Answer


I don't know Python, but isn't there a streaming solution for handling large file uploads? Perhaps something like this: Python: HTTP Post a large file with streaming?

In that case, splitting the whole file into chunks and making multiple range requests becomes pointless.
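On the browser side, the counterpart of a streaming server is simply POSTing the File object itself: fetch reads a Blob incrementally, so the whole file never sits in memory and no client-side chunking is needed. A minimal sketch, assuming the server accepts a raw body; the "/upload" endpoint and the X-File-Name / X-File-Id headers are assumptions, not part of the original CGI interface.

```javascript
// Sketch: upload a whole file in one request. The browser (or Node's fetch)
// streams the Blob body, so memory use does not grow with file size.
// The endpoint URL and custom headers are hypothetical.
async function uploadWholeFile(file, fileId, url = "/upload") {
  const response = await fetch(url, {
    method: "POST",
    headers: {
      "Content-Type": "application/octet-stream",
      "X-File-Name": file.name,
      "X-File-Id": String(fileId), // lets the server pick the directory
    },
    body: file, // the Blob is read and sent incrementally
  });
  if (!response.ok) throw new Error("Upload failed: " + response.status);
  return response;
}
```

The server would then write the request body to disk as it arrives, instead of reassembling numbered parts.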

  • Wish I'd come across this before writing all this stuff. Anyway, I'll give it a go and let you know how I get on – Keir Sep 14 '16 at 23:36