
I'm making a simple file uploader with end-to-end encryption. I have the uploading/encryption working, but I'm not sure how to save larger files locally without running out of memory. Say I have a 2 GB file that I decrypt in chunks of, say, 1 KB. Once I have all these chunks (either as buffers or blobs), how can I save them locally as a stream? I found this:

var json = JSON.stringify(data),
    blob = new Blob([json], { type: "application/octet-stream" }),
    url = window.URL.createObjectURL(blob),
    a = document.createElement("a"); // an anchor to trigger the download
a.href = url;
a.download = fileName;
a.click();
window.URL.revokeObjectURL(url);

but with big files I think that would just cause the browser to crash.

Edit: I found this, which says the memory limit for blobs is 500 MiB, which is definitely too low. Do I have any options here?
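One option worth noting: the File System Access API (Chromium-based browsers) exposes a writable stream, so each decrypted chunk can be written to disk as it is produced instead of being accumulated in memory. A minimal sketch, where decryptChunk and encryptedChunks are hypothetical stand-ins for the asker's own encryption code:

// Sketch only: requires the File System Access API (Chromium) and a user
// gesture; decryptChunk and encryptedChunks are hypothetical placeholders
async function saveDecryptedStream(fileName, encryptedChunks) {
  // ask the user where to save; returns a handle we can stream into
  var handle = await window.showSaveFilePicker({ suggestedName: fileName });
  var writable = await handle.createWritable();
  for (var encrypted of encryptedChunks) {
    // decrypt one chunk at a time so only one chunk lives in memory at once
    var plain = await decryptChunk(encrypted); // hypothetical
    await writable.write(plain);
  }
  await writable.close(); // flushes the stream and finalizes the file on disk
}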

  • It's unclear what "upload a file to the browser" means. Usually browsers download files from a server or upload files to a server. It's also unclear where you're getting chunks from. I think we need to see some of your coding attempt at the problem to have a better idea what you're trying to do. FYI, per guidelines here, your question should not rely on an external link in order to be understood. Please add enough relevant info to your question (preferably your attempt at coding this) so we can understand what you have so far. – jfriend00 Dec 18 '16 at 08:44
  • I updated it with some more info. – Sneaky Beaver Dec 18 '16 at 08:53
  • [To read a Blob/File as an ArrayBuffer](https://developer.mozilla.org/en-US/docs/Web/API/FileReader/readAsArrayBuffer), use FileReader. But if you can directly send a Blob to your node lib, then just use `Blob.slice(start, end)`. To reconstruct from blob slices: `new Blob([blobSlice1, ..., blobSliceN])` – Kaiido Dec 18 '16 at 08:56
  • I'm not seeing a way to save it – Sneaky Beaver Dec 18 '16 at 08:57
  • So once I make the new blob, how would I go about saving it as a file with the correct name/extension? Or does that just get included in the buffer? – Sneaky Beaver Dec 18 '16 at 09:01
  • I found this http://jsfiddle.net/koldev/cW7W5/ but I'm worried about a memory overload on big files. Is there another way to do this? – Sneaky Beaver Dec 18 '16 at 09:03
  • Well once you've got the arrayBuffer, you can just do `new Blob([arraybuffer], options)` – Kaiido Dec 18 '16 at 09:10
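Putting the comments above together, a minimal sketch of reading a Blob slice as an ArrayBuffer with FileReader and rebuilding a single Blob from the processed slices (the ~1 KB chunk size mirrors the question; the callback shape is just one possible design):

var CHUNK_SIZE = 1000; // ~1 KB, as in the question

// read one ~1 KB slice of a Blob as an ArrayBuffer
function sliceToArrayBuffer(blob, start, onDone) {
  var reader = new FileReader();
  reader.onload = function () {
    onDone(reader.result); // reader.result is an ArrayBuffer
  };
  reader.readAsArrayBuffer(blob.slice(start, start + CHUNK_SIZE));
}

// rebuild a single Blob from processed slices; the Blob constructor
// accepts a mixed array of ArrayBuffers and Blobs
function rebuild(slices, type) {
  return new Blob(slices, { type: type });
}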

1 Answer


You can concatenate all your ArrayBuffers or Blobs in a single call to `new Blob([all, your, chunks])`.

// a simple array to store all our chunks
var chunks = [];

function processChunks() {
  // concatenate all our chunks into a single Blob
  var blob = new Blob(chunks);
  // do something with this final blob; here we just display it
  var img = new Image();
  img.src = URL.createObjectURL(blob);
  document.body.appendChild(img);
}

var xhr = new XMLHttpRequest();
xhr.open('get', 'https://dl.dropboxusercontent.com/s/nqtih1j7vj850ff/c6RvK.png');
xhr.responseType = 'blob'; // we want the response as a Blob
xhr.onload = function() {
  var b = this.response;
  // here we slice our blob into ~1 KB chunks
  for (var i = 0; i < b.size; i += 1000) {
    chunks.push(b.slice(i, i + 1000));
  }
  console.log(chunks.length);
  processChunks();
};
xhr.send();
And the CSS used by the snippet:

img {
  width: 100vw;
  height: 100vh;
}
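To save the recombined blob under its original name rather than display it, the anchor-download pattern from the question can be reused (a sketch; fileName is assumed to be known from the upload metadata):

function saveBlob(blob, fileName) {
  var a = document.createElement('a');
  a.href = URL.createObjectURL(blob);
  a.download = fileName; // saves under the original name/extension
  document.body.appendChild(a); // some browsers require the anchor in the DOM
  a.click();
  a.remove();
  URL.revokeObjectURL(a.href);
}

Note also that a Blob built from slices of another Blob typically references the same underlying bytes rather than copying them, and Chromium can page large Blob data to disk once the in-memory limit (the 500 MiB mentioned in the question) is reached, so the final new Blob(chunks) is usually less memory-hungry than it appears.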
Kaiido