
I have the following HTML code:

<input type='file' multiple>

And here's my JS code:

var inputFiles = document.getElementsByTagName("input")[0];
inputFiles.onchange = function(){
    var fr = new FileReader();
    for(var i = 0; i < inputFiles.files.length; i++){
        fr.onload = function(){
            console.log(i) // Prints "0, 3, 2, 1" in case of 4 chosen files
        }
    }
    fr.readAsDataURL(inputFiles.files[i]);
}

So my question is, how can I make this loop synchronous? That is, first wait for the file to finish loading, then move on to the next file. Someone told me to use JS Promises, but I can't make it work. Here's what I'm trying:

var inputFiles = document.getElementsByTagName("input")[0];
inputFiles.onchange = function(){
    for(var i = 0; i < inputFiles.files.length; i++){
        var fr = new FileReader();
        var test = new Promise(function(resolve, reject){
            console.log(i) // Prints 0, 1, 2, 3 just as expected
            resolve(fr.readAsDataURL(inputFiles.files[i]));
        });
        test.then(function(){
            fr.onload = function(){
                console.log(i); // Prints only 3
            }
        });
    };
}

Thanks in advance...

Zahid Saeed
  • Promises are used for asynchronous operations. – Michał Perłakowski Dec 28 '15 at 15:02
  • How do I make it synchronous then? I have studied on the internet; they all say that it makes the code synchronous – Zahid Saeed Dec 28 '15 at 15:04
  • @ZahidSaeed: No, promises don't make code synchronous. Can you point to one of those places that "all" say they do? – T.J. Crowder Dec 28 '15 at 15:10
  • MDN says: "A Promise represents a proxy for a value not necessarily known when the promise is created. It allows you to associate handlers to an asynchronous action's eventual success value or failure reason. This lets asynchronous methods return values like SYNCHRONOUS METHODS." – Zahid Saeed Dec 28 '15 at 15:22
  • Yes. You can make it behave in a synchronous fashion if by synchronous you mean that you can control the order in which the different parts of your file reading code are executed. But it will still run asynchronously with respect to e.g. mouse handlers or the JavaScript engine itself. "Like a synchronous method" is not the same as "is a synchronous method". – Tesseract Dec 28 '15 at 15:27
  • I added an answer to a similar question which might help here. It not only uses promises, but it allows the file to be easily added to a JSON object and passed around: https://stackoverflow.com/a/51543922/1414170 – tkd_aj May 06 '21 at 02:34

8 Answers

80

We modified mido's answer to get it to work as follows:

function readFile(file){
  return new Promise((resolve, reject) => {
    var fr = new FileReader();
    fr.onload = () => {
      resolve(fr.result);
    };
    fr.onerror = reject;
    // Note: this expects an object carrying a `blob` property;
    // pass the File itself (fr.readAsText(file)) if you already have a File/Blob.
    fr.readAsText(file.blob);
  });
}
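For example, a minimal usage sketch with await (assuming each entry carries a blob property, as readFile above expects):

async function readAllEntries(entries) {
  for (const entry of entries) {
    const text = await readFile(entry); // waits for each read before starting the next
    console.log(text);
  }
}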
Jens Lincke
    I tried A LOT of different solutions and this was the only one that worked for me calling an API from Blazor. Thank you!!! – João Pedro Sousa Mar 28 '20 at 18:54
  • I adapted the answer to this in a RxJS pipe: ```switchMap( file => new Promise( (resolve, reject) => Object.assign( new FileReader(), { onload: e => resolve( e.currentTarget.result ), onerror: reject }).readAsDataURL(file) )),``` – loop Jun 11 '21 at 10:15
  • I don't really like `fr.onerror = reject` - the parameter to onerror is a `ProgressEvent` object whereas reject should be called with an `Error` – Andy Apr 13 '22 at 10:09
36

If you want to do it sequentially (not synchronously) using Promises, you could do something like:

var inputFiles = document.getElementsByTagName("input")[0];
inputFiles.onchange = function(){
  var promise = Promise.resolve();
  Array.from(inputFiles.files).forEach(file => {
    // Reassign `promise` so each read is chained after the previous one finishes
    promise = promise.then(() => pFileReader(file));
  });
  promise.then(() => console.log('all done...'));
}

function pFileReader(file){
  return new Promise((resolve, reject) => {
    var fr = new FileReader();  
    fr.onload = resolve;  // CHANGE to whatever function you want which would eventually call resolve
    fr.onerror = reject;
    fr.readAsDataURL(file);
  });
}
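For example, a minimal sketch of reading a single file and pulling out the data URL (someFile is a placeholder; pFileReader above resolves with the FileReader's load event, so the result lives on event.target):

pFileReader(someFile).then(event => {
  var dataUrl = event.target.result; // the base64 data URL produced by readAsDataURL
  console.log(dataUrl);
});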
mido
  • you're the man! The 2nd part (new Promise) is exactly what I needed to get my FileReader to return as a Promise as needed. Thank you! – skplunkerin Jul 06 '16 at 21:30
  • I had to do `new Promise<string>((resolve, reject) => {` to get this to work... (Needed the `<string>` generic.) – Vaccano Jul 07 '16 at 17:38
  • dude. can you update your answer with just how to execute the pFileReader(file) and get the result from promise in a variable instead of inputfiles.files.map because am new to this and struggling with the solution. but your sol looks great. what i am trying to do is to call a method to convert image to base64 and return it to the caller and i save the data into a variable. pls help – Thameem Apr 08 '20 at 13:28
31

Preface: This answer, originally written in 2015, shows wrapping FileReader in a promise. That's still a perfectly valid way to do the readAsDataURL operation the question asked about, but if you were going to use readAsText or readAsArrayBuffer (in general, new code shouldn't use the older readAsBinaryString), you'd want to use the File object's built-in promise-based methods text or arrayBuffer instead (or possibly stream if you want to do inline processing of the data as it flows through), all of which are inherited from Blob.
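For instance, a minimal sketch of that built-in promise-based approach (assuming file is a File obtained from the input; Blob.prototype.text returns a promise for the file's contents as a string):

async function showContents(file) {
    const text = await file.text(); // no FileReader needed for simple text reads
    console.log(text);
}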


The nature of FileReader is that you cannot make its operation synchronous.

I suspect you don't really need or want it to be synchronous, just that you want to get the resulting URLs correctly. The person suggesting using promises was probably right, but not because promises make the process synchronous (they don't), but because they provide standardized semantics for dealing with asynchronous operations (whether in parallel or in series):

Using promises, you'd start with a promise wrapper for readAsDataURL (I'm using ES2015+ here, but you can convert it to ES5 with a promise library instead):

function readAsDataURL(file) {
    return new Promise((resolve, reject) => {
        const fr = new FileReader();
        fr.onerror = reject;
        fr.onload = () => {
            resolve(fr.result);
        }
        fr.readAsDataURL(file);
    });
}

Then you'd use the promise-based operations I describe in this answer to read those in parallel:

Promise.all(Array.prototype.map.call(inputFiles.files, readAsDataURL))
.then(urls => {
    // ...use `urls` (an array) here...
})
.catch(error => {
    // ...handle/report error...
});

...or in series:

let p = Promise.resolve();
for (const file of inputFiles.files) {
    p = p.then(() => readAsDataURL(file).then(url => {
        // ...use `url` here...
    }));
}
p.catch(error => {
    // ...handle/report error...
});

Inside an ES2017 async function, you could use await. It doesn't do much for the parallel version:

// Inside an `async` function
try {
    const urls = await Promise.all(Array.prototype.map.call(inputFiles.files, readAsDataURL));
} catch (error) {
    // ...handle/report error...
}

...but it makes the series version simpler and clearer:

// Inside an `async` function
try {
    for (const file of inputFiles.files) {
        const url = await readAsDataURL(file);
        // ...use `url` here...
    }
} catch (error) {
    // ...handle/report error...
}

Without promises, you'd do this by keeping track of how many outstanding operations you have so you know when you're done:

const inputFiles = document.getElementsByTagName("input")[0];
inputFiles.onchange = () => {
    const data = [];    // The results
    let pending = 0;    // How many outstanding operations we have

    // Schedule reading all the files (this finishes before the first onload
    // callback is allowed to be executed). Note that the use of `let` in the
    // `for` loop is important, `var` would not work correctly.
    for (let index = 0; index < inputFiles.files.length; ++index) {
        const file = inputFiles.files[index];
        // Read this file, remember it in `data` using the same index
        // as the file entry
        const fr = new FileReader();
        fr.onload = () => {
            data[index] = fr.result;
            --pending;
            if (pending == 0) {
                // All requests are complete, you're done
            }
        }
        fr.readAsDataURL(file);
        ++pending;
    }
};

Or if you want for some reason to read the files sequentially (but still asynchronously), you can do that by scheduling the next call only when the previous one is complete:

// Note: This assumes there is at least one file, if that
// assumption isn't valid, you'll need to add an up-front check
var inputFiles = document.getElementsByTagName("input")[0];
inputFiles.onchange = () => {
    let index = 0;

    readNext();

    function readNext() {
        const file = inputFiles.files[index++];
        const fr = new FileReader();
        fr.onload = () => {
            // use fr.result here
            if (index < inputFiles.files.length) {
                // More to do, start loading the next one
                readNext();
            }
        }
        fr.readAsDataURL(file);
    }
};
T.J. Crowder
10

I upgraded Jens Lincke's answer by adding a working example and introducing async/await syntax:

function readFile(file) {
  return new Promise((resolve, reject) => {
    let fr = new FileReader();
    fr.onload = x => resolve(fr.result);
    fr.onerror = reject;
    fr.readAsDataURL(file); // or readAsText(file) to get raw content
  });
}

async function load(e) {
  for(let [i,f] of [...e.target.files].entries() ){
    msg.innerHTML += `<h1>File ${i}: ${f.name}</h1>`;
    let p = document.createElement("pre");
    p.innerText += await readFile(f);
    msg.appendChild(p);
  }
}
<input type="file" onchange="load(event)" multiple />
<div id="msg"></div>
Kamil Kiełczewski
3

Promisified FileReader

/**
 * Promisified FileReader
 * More info https://developer.mozilla.org/en-US/docs/Web/API/FileReader
 * @param {*} file
 * @param {*} method: readAsArrayBuffer, readAsBinaryString, readAsDataURL, readAsText
 */
export const readFile = (file = {}, method = 'readAsText') => {
  const reader = new FileReader()
  return new Promise((resolve, reject) => {
    reader[method](file)
    reader.onload = () => {
      resolve(reader)
    }
    reader.onerror = (error) => reject(error)
  })
}

Usage

const file =  new File(["foo"], "foo.txt", {
  type: "text/plain",
});

// Text
const resp1 = await readFile(file)
console.log(resp1.result)

// DataURL
const resp2 = await readFile(file, 'readAsDataURL')
console.log(resp2.result)
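A minimal sketch for multiple files, run inside an async function (files is assumed to be a FileList from an input element; each resolved value is the FileReader itself, so the content is on its result property):

// Read several files in parallel as data URLs
const readers = await Promise.all([...files].map((f) => readFile(f, 'readAsDataURL')))
console.log(readers.map((r) => r.result))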
Jelle Hak
1

Using promises can make it much more elegant:

// Opens a file dialog, waits until the user selects a file, and resolves with a data URL of the chosen file

async function pick() {
  var filepicker = document.createElement("input");
  filepicker.setAttribute("type","file");
  filepicker.click();
  return new Promise((resolve,reject) => {
    filepicker.addEventListener("change", e => {
      var reader = new FileReader();
      reader.addEventListener('load', file => resolve(file.target.result));
      reader.addEventListener('error', reject);
      reader.readAsDataURL(e.target.files[0]);
    });
  });
}

// Only call this function on a user event

window.onclick = async function() {
  var file = await pick();
  console.log(file);
}
Bergi
  • 630,263
  • 148
  • 957
  • 1,375
0

Here is another modification to Jens' answer (piggybacking off mido's answer) to additionally check the file size:

function readFileBase64(file, max_size){
    var max_size_bytes = max_size * 1048576; // convert the MB limit to bytes
    return new Promise((resolve, reject) => {
        if (file.size > max_size_bytes) {
            console.log("file is too big at " + (file.size / 1048576) + "MB");
            reject("file exceeds max size of " + max_size + "MB");
        }
        else {
            var fr = new FileReader();
            fr.onload = () => {
                resolve(fr.result);
            };
            fr.onerror = reject;
            fr.readAsDataURL(file);
        }
    });
}
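For example, a minimal usage sketch inside an async function (someFile and the 5 MB limit are placeholder assumptions):

try {
    const dataUrl = await readFileBase64(someFile, 5); // rejects anything over 5 MB
    console.log(dataUrl.slice(0, 40) + "...");
} catch (err) {
    console.error(err); // e.g. "file exceeds max size of 5MB"
}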
JasonZiolo
  • This answer correctly uses `onloadend` instead of `onload` which may give partial results. – nathan-m Apr 24 '19 at 06:06
  • ?? No, I was saying that `onloadend` is the right one to use. Every other answer uses the _wrong_ api (`onload`) – nathan-m Sep 12 '19 at 05:33
  • `onloadend` triggers if the file was read successfully or unsuccessfully, while `onload` triggers only on successful read @nathan-m – jiroch Oct 31 '19 at 21:47
0

We can use a callback function to get the reader.result:

function myDisplay(some) {
    document.getElementById('demo').innerHTML = some;
}


function read(file, callback) {
  const reader = new FileReader();

  reader.onload = () => {
    callback(reader.result);
  }

  reader.readAsText(file);
}

// When you pass a function as an argument, remember not to use parentheses.
read(this.files[0], myDisplay);
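This assumes the call runs inside the file input's change handler, so `this` refers to the input element. A minimal wiring sketch (the selector is an assumption):

document.querySelector('input[type="file"]').addEventListener('change', function () {
  read(this.files[0], myDisplay); // `this` is the input element inside a non-arrow handler
});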
Boontawee Home