
I am trying to capture the download progress of a Fetch request and use it to change the width of a progress bar. I looked at ProgressEvent.lengthComputable as a potential solution, but I am unsure whether it can be used with the Fetch API.

– Jai Sandhu

3 Answers

20

Without checking for errors (try/catch etc.):

const elStatus = document.getElementById('status');
function status(text) {
  elStatus.innerHTML = text;
}

const elProgress = document.getElementById('progress');
function progress({loaded, total}) {
  elProgress.innerHTML = Math.round(loaded / total * 100) + '%';
}

async function main() {
  status('downloading with fetch()...');
  const response = await fetch('https://fetch-progress.anthum.com/30kbps/images/sunrise-baseline.jpg');

  // Expected total size, taken from the Content-Length header (may be absent).
  const contentLength = response.headers.get('content-length');
  const total = parseInt(contentLength, 10);
  let loaded = 0;

  // Wrap the body in a new Response whose ReadableStream re-emits each chunk,
  // so progress can be reported as the download proceeds.
  const res = new Response(new ReadableStream({
    async start(controller) {
      const reader = response.body.getReader();
      for (;;) {
        const {done, value} = await reader.read();
        if (done) break;
        loaded += value.byteLength;
        progress({loaded, total});
        controller.enqueue(value);
      }
      controller.close();
    },
  }));
  const blob = await res.blob();
  status('download completed');
  document.getElementById('img').src = URL.createObjectURL(blob);
}

main();
<div id="status">&nbsp;</div>
<h1 id="progress">&nbsp;</h1>
<img id="img" />

The fetch snippet above is adapted from here.

– gman

  • This doesn't work if the server sends a compressed, encoded response, for example gzip. Say the client sends an `Accept-Encoding: gzip` header and the server responds with `Content-Type: application/json`, `Content-Encoding: gzip`, `Content-Length: xxx`; then the length `xxx` will be much smaller than the total length of the chunks read from the body reader. Basically, `loaded` will exceed `total` after a certain point, because the `content-length` header contains the size of the compressed response, while `loaded` counts chunk sizes after decompression. – ecthiender Feb 18 '22 at 13:33
  • Apparently you're out of luck for compressed content, period. There's no standard header that sends the size of the decompressed data. If you control the server, you could send a custom header with that info (see the sketch after these comments). – gman Feb 18 '22 at 17:33
  • **Progress incorrect when content is gzip encoded**: https://github.com/AnthumChris/fetch-progress-indicators/issues/13 / **Incorrect progress for gzip encoded response**: https://github.com/samundrak/fetch-progress/issues/22 – Mir-Ismaili Jul 07 '22 at 21:20
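For the compressed case, here is a sketch of the custom-header idea from the comment above. The header name x-decompressed-content-length is hypothetical: your server would have to set it to the decoded body size (and expose it via Access-Control-Expose-Headers for cross-origin requests):

// Prefer a hypothetical server-provided header with the decoded size,
// falling back to content-length (which is the compressed size).
const response = await fetch(url);
const total = Number(
  response.headers.get('x-decompressed-content-length') ??
  response.headers.get('content-length')
);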
3

Using this utility:

async function* streamAsyncIterable(stream) {
  const reader = stream.getReader()
  try {
    while (true) {
      const { done, value } = await reader.read()
      if (done) return
      yield value
    }
  } finally {
    // Release the lock so the stream can be used elsewhere if needed.
    reader.releaseLock()
  }
}

See: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/for-await...of#iterating_over_async_generators

Then you can use a for await...of loop:

const response = await fetch(url)
let responseSize = 0
for await (const chunk of streamAsyncIterable(response.body)) {
  responseSize += chunk.length
}
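If the content-length header is available, this byte count can be turned into a percentage; a minimal sketch (but see the caveat below about what this size means):

const response = await fetch(url)
const total = Number(response.headers.get('content-length')) // 0 if the header is absent
let responseSize = 0
for await (const chunk of streamAsyncIterable(response.body)) {
  responseSize += chunk.length
  if (total) console.log(Math.round(responseSize / total * 100) + '%')
}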

But be aware that responseSize is the response size, not necessarily the download size! What is the difference? There is no difference if there is no content-encoding (gzip, br, ...). But if compression was applied, the final download size will be the size of the compressed data (the same as content-length), while the final response size will be the size of the uncompressed data.

See @ecthiender's comment above and this thread.

– Mir-Ismaili
  • This seems like a good answer. I have one issue: I tried to call foo = await response.json() afterwards, and it failed because the body stream had already been read. Is there a way to get the JSON from the response at the end? – dooderson Oct 14 '22 at 21:50
  • 2
    @dooderson; AFAIK, in that case, you need to do that manually. I mean you need to read (and concatenate) `totalBytes` (in the same `for await` loop). Then you need to convert `totalBytes` to `string` and then `JSON.parse()` it. Don't convert every `chunk` to `string` (then concatente `string`s)! This causes some issues with multi-byte characters that may be placed at the boundaries of two sequent chunks. – Mir-Ismaili Oct 17 '22 at 10:38
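A sketch of what that comment describes, using the streamAsyncIterable utility above: collect the chunks while counting bytes, then decode everything in one pass so multi-byte characters split across chunk boundaries decode correctly:

const response = await fetch(url)
const chunks = []
let loaded = 0
for await (const chunk of streamAsyncIterable(response.body)) {
  chunks.push(chunk)
  loaded += chunk.length // drive your progress UI here
}
// Concatenate all chunks into a single Uint8Array, then decode once.
const bytes = new Uint8Array(loaded)
let offset = 0
for (const chunk of chunks) {
  bytes.set(chunk, offset)
  offset += chunk.length
}
const json = JSON.parse(new TextDecoder().decode(bytes))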
-5

You can use axios instead. Note that this example reports upload progress; a download variant is sketched below:

import axios from 'axios'

export async function uploadFile(file, cb) {
  const url = `//127.0.0.1:4000/profile`
  try {
    const formData = new FormData()
    formData.append("avatar", file)
    // onUploadProgress fires with a progress event as the request body is sent.
    const data = await axios.post(url, formData, {
      onUploadProgress: (progressEvent) => {
        console.log(progressEvent)
        if (progressEvent.lengthComputable) {
          const percentComplete = progressEvent.loaded / progressEvent.total
          if (cb) {
            cb(percentComplete)
          }
        }
      }
    })
    return data
  } catch (error) {
    console.error(error)
  }
}
– Haile
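Since the question asks about download progress, note that axios also accepts an onDownloadProgress callback. A minimal sketch, assuming a browser environment where the event exposes loaded and total (total may be undefined, with the same gzip caveat as in the comments above):

const response = await axios.get(url, {
  onDownloadProgress: (progressEvent) => {
    if (progressEvent.total) {
      console.log(Math.round(progressEvent.loaded / progressEvent.total * 100) + '%')
    }
  }
})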