
Using FileReader.readAsArrayBuffer(), I read some content from a binary file and got an ArrayBuffer.

Now I have to send the raw binary data using a jQuery $.ajax call.

The FileReader.readAsBinaryString() function is deprecated because it is non-standard. I tried different solutions proposed here, but they alter the data due to binary -> UTF transformations.

Any hint is greatly appreciated, other than cycling through the array and appending each byte to the string to send.

Thanks for the help! :-)

Francesco Piraneo G.
  • Why do you need to send the raw binary? If you're able to use FileReader, you already have the blob, so just send it. If you really want to read the binary then you can use `XHR.open('get', URL.createObjectURL(blob))` – Kaiido Sep 22 '16 at 08:34
  • 1. To send a single 1 MB block of a large file (> 1 GB) where modifying the php.ini limit is not possible; 2. Execute a file upload with resume in case of a transfer failure; 3. Introduce a checksum of each block and promptly recover in case of error. In short, integrate into my web application some features that every file uploader offers, without needing an external library, because I have to do some rework on a file once it's uploaded. – Francesco Piraneo G. Sep 22 '16 at 13:22
  • Ok, it would be good to include this info in an [edit] to your question. For 1, you could send only chunks of the big Blob (cf. `blob.slice()`), and then [merge them server-side](http://stackoverflow.com/questions/36045690/merging-file-chunks-in-php). For 2, the above will work too. For 3, not sure how you could achieve it, but reading and parsing a 1 GB file on the front side sounds like a bad idea IMO. – Kaiido Sep 22 '16 at 13:40
  • I already read the file with blob.slice() and I get an ArrayBuffer, **but** I cannot feed the ArrayBuffer directly to a jQuery AJAX call: I have to convert it into something more usable! Not a UTF string: the data will be altered! – Francesco Piraneo G. Sep 22 '16 at 14:16
  • But why don't you just send these sliced Blobs (which are just chunks of the original, non-converted file)? For the checksum, I think you'll need to loop through the ArrayBuffer. (Here again I don't get why you explicitly say you don't want to do it.) – Kaiido Sep 22 '16 at 14:23
  • Because the arrayBuffer content is not directly accessible (= cannot be sent); it must be converted to an array. If this is not correct, can you point me to a piece of code with a practical example? – Francesco Piraneo G. Sep 22 '16 at 20:17
  • You probably already save all your chunks in an array. Then you can read these chunks as ArrayBuffers if you want. But your chunks are still there in the array. FileReader doesn't modify the Blob passed to it. So **just send the Blobs**, not the ArrayBuffers. – Kaiido Sep 22 '16 at 22:54

3 Answers


Have you tried the FileReader.readAsText method? It allows you to specify the encoding, and uses UTF-8 by default.

See https://developer.mozilla.org/en-US/docs/Web/API/FileReader/readAsText

EDIT

To send a file with jQuery, you don't need to read it first. You can simply send it using a FormData object.

See How can I upload files asynchronously?.
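A minimal sketch of that approach, assuming a hypothetical `/upload` endpoint (`file` would come from an `<input type="file">` or a `blob.slice()` chunk):

```javascript
// Sketch: send a File/Blob as-is with FormData — no FileReader step needed.
// The '/upload' endpoint is a placeholder.
function uploadFile(file) {
    var fd = new FormData();
    fd.append('data', file);      // the raw bytes go through untouched
    return $.ajax({
        url: '/upload',
        type: 'POST',
        data: fd,
        processData: false,       // don't let jQuery serialize the FormData
        contentType: false        // the browser sets the multipart boundary
    });
}
```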

  • UTF-8 alters the data! It should be binary-safe! – Francesco Piraneo G. Sep 21 '16 at 17:03
  • UTF-8 doesn't alter data. A string in a given encoding is just a representation of a binary stream. If the file you read using the FileReader contains text, you need to know its encoding to be able to convert it back to a string correctly. Have you tried with ASCII? – Manuel Guilbault Sep 21 '16 at 17:21
  • Just tested again: the original file is a PNG of 170'040 bytes, but FileReader.readAsText returns a string 306'478 characters long... Please explain what you mean by "ASCII", because it is not a valid encoding. – Francesco Piraneo G. Sep 21 '16 at 19:55

It feels unnecessary to use FileReader at all. You don't have to read the file's content to send it.

var blob = new Blob(['abcdef']) // simulate a file
var chunkSize = 2 // bytes to upload in each request
var chunks = [] // result will be 3 chunks [ab, cd, ef]
var offset = 0 // byte offset

// build the chunk array
do {
    chunks.push( blob.slice(offset, offset + chunkSize) )
    offset += chunkSize
} while (offset < blob.size)

// upload the chunks one at a time
chunks.reduce((uploaded, nextChunk) =>
    uploaded.then(() =>
        fetch('/upload', {
            method: 'post',
            body: nextChunk
        })
    ), Promise.resolve()).then(() => {
    // all chunks have been uploaded in sequential order
})

You probably don't want to use jQuery ajax either, since it requires you to use .ajax (not .post) and you need to set processData & contentType to false, among other things.

Look at how to send a blob with jQuery: How can javascript upload a blob?

fetch is IMO better than jQuery's ajax and is built into the browser; it just sometimes needs a polyfill.


Edit: Best would be if you could change php.ini to allow larger uploads, so you don't have to do all of these hacks to upload large files
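For the resume-on-error requirement, a small retry wrapper around fetch is one option; this is only a sketch, and the `/upload` endpoint and retry count are assumptions:

```javascript
// Sketch: re-send a chunk a few times before giving up.
// The '/upload' endpoint is a placeholder.
function uploadWithRetry(chunk, retriesLeft) {
    return fetch('/upload', { method: 'POST', body: chunk })
        .then(res => {
            if (!res.ok) throw new Error('HTTP ' + res.status)
            return res
        })
        .catch(err => {
            if (retriesLeft <= 0) throw err   // out of attempts: propagate
            return uploadWithRetry(chunk, retriesLeft - 1)
        })
}
```

Chaining this into the reduce above gives sequential uploads with per-chunk retries.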

Endless
  • I have to read the file content to: - monitor and give feedback on the progress of the upload; - send and verify a checksum of each block; - resend a block in case of error; - on the server I may not have access to php.ini, and I don't want to find out an upload failed after sending 8 GB of a file. For those reasons I cannot use the proposed solution. – Francesco Piraneo G. Sep 22 '16 at 21:48
  • @FrancescoPiraneoG., the only thing missing here is the progress handling; for that feature you'll need XHR. But the rest of the answer stays the same. For resending in case of error, with `fetch` just use `.catch`, and for XHR `onerror`. – Kaiido Sep 23 '16 at 05:06
  • yikes... I would recommend [WebTorrent](https://webtorrent.io/) here plus some hybrid server-side torrent client; it will do checksums, pause, resume, etc... or go all-NodeJS style with streams & pipes and then also listen to the progress event. Anyways... perhaps https://github.com/23/resumable.js is something for you? – Endless Sep 23 '16 at 08:56

First, read the file as an ArrayBuffer; it seems the most flexible solution:

var reader = new FileReader();
reader.readAsArrayBuffer(blob);

...and in the callback convert it to a Blob:

$(reader).on("loadend", function(evt) {
    if (evt.target.readyState === FileReader.DONE) { // DONE == 2
        blob = new Blob([evt.target.result]);
    }
});

Finally, send it with a form:

    var fd = new FormData();
    fd.append('xferID', xferID);
    fd.append('data', blob);

    $.ajax({
        url: 'http://......',
        data: fd,
        processData: false,
        contentType: false,
        type: "POST",

        success: function(resultJSON) {...},
        error: function(resultJSON) {...}
    });

This is obviously just an outline; on the PHP side you'll receive the blob data in the $_FILES superglobal.

Hope this helps. :-)

Francesco Piraneo G.