I am trying to capture the download progress of a fetch() request and use it to change the width of a progress bar. I looked at ProgressEvent.lengthComputable as a potential solution, but I'm unsure whether it can be used with the Fetch API.
- Not true. The promise from a fetch() resolves after the first packet has been received, but doesn't wait until the whole body is there. – Touffy Nov 16 '17 at 20:55
- https://stackoverflow.com/questions/36453950/upload-file-with-fetch-api-in-javascript-and-show-progress – Adrian Nov 16 '17 at 20:58
- Then https://stackoverflow.com/questions/35711724/progress-indicators-for-fetch would be better; besides, it's older – skyboyer Nov 16 '17 at 20:59
- Can't flag as duplicate because of the bounty, but it's all there. – Touffy Nov 16 '17 at 21:02
- @Adriani6 @Touffy thanks a lot for that information – skyboyer Nov 16 '17 at 21:15
- Let's reopen because this question is download-specific, and the suggested duplicate answer is upload-specific – anthumchris May 01 '20 at 01:25
- The answer you're looking for is here: https://stackoverflow.com/a/70623370/10030693 – Gilbert Apr 13 '23 at 11:31
3 Answers
Without error checking (try/catch etc.):
const elStatus = document.getElementById('status');
function status(text) {
  elStatus.innerHTML = text;
}

const elProgress = document.getElementById('progress');
function progress({loaded, total}) {
  elProgress.innerHTML = Math.round(loaded / total * 100) + '%';
}

async function main() {
  status('downloading with fetch()...');
  const response = await fetch('https://fetch-progress.anthum.com/30kbps/images/sunrise-baseline.jpg');
  const contentLength = response.headers.get('content-length');
  const total = parseInt(contentLength, 10);
  let loaded = 0;

  // Wrap the original body in a new stream so bytes can be counted as they arrive.
  const res = new Response(new ReadableStream({
    async start(controller) {
      const reader = response.body.getReader();
      for (;;) {
        const {done, value} = await reader.read();
        if (done) break;
        loaded += value.byteLength;
        progress({loaded, total});
        controller.enqueue(value);
      }
      controller.close();
    },
  }));

  const blob = await res.blob();
  status('download completed');
  document.getElementById('img').src = URL.createObjectURL(blob);
}
main();
main();
<div id="status"> </div>
<h1 id="progress"> </h1>
<img id="img" />
adapted from here
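Since the question is specifically about driving a progress bar's width, the percentage computed in progress() above can be turned into a CSS width. A minimal sketch; the clamping behavior and the fallback for a missing Content-Length are my own assumptions, not part of the answer above:

```javascript
// Compute a CSS width string from loaded/total bytes.
// Clamps to 0–100 and tolerates an unknown total (e.g. no Content-Length header).
function progressWidth(loaded, total) {
  if (!Number.isFinite(total) || total <= 0) return '100%'; // indeterminate fallback
  const pct = Math.min(100, Math.max(0, Math.round((loaded / total) * 100)));
  return pct + '%';
}

// Hypothetical usage inside the progress() callback, assuming a #bar element:
// document.getElementById('bar').style.width = progressWidth(loaded, total);
```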

gman
- This doesn't work if the server sends a compressed-encoded response, e.g. gzip. Say the client sends an `Accept-Encoding: gzip` header and the server responds with `Content-Type: application/json`, `Content-Encoding: gzip`, `Content-Length: xxx`; then the length `xxx` will be much smaller than the total length of the chunks read from the body reader. Basically, `loaded` will exceed `total` after a certain point, because the `content-length` header contains the size of the compressed response, while `loaded` is the chunk size measured after decompression. – ecthiender Feb 18 '22 at 13:33
- Apparently you're out of luck for compressed stuff, period. There's no standard header that sends the size of the uncompressed data. If you control the server, you could send a custom header with that info. – gman Feb 18 '22 at 17:33
- **Progress incorrect when content is gzip encoded**: https://github.com/AnthumChris/fetch-progress-indicators/issues/13 / **Incorrect progress for gzip encoded response**: https://github.com/samundrak/fetch-progress/issues/22 – Mir-Ismaili Jul 07 '22 at 21:20
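The custom-header workaround mentioned in the comment above could be sketched like this. The header name `x-decompressed-content-length` is an invented example, not a standard; your server would have to send it:

```javascript
// Pick the best-available total for progress reporting.
// Prefers a hypothetical custom header carrying the uncompressed size,
// falling back to Content-Length (only accurate for identity encoding).
function totalFromHeaders(headers) {
  const custom = headers.get('x-decompressed-content-length'); // assumed custom header
  const standard = headers.get('content-length');
  const total = parseInt(custom ?? standard ?? '', 10);
  return Number.isFinite(total) ? total : 0; // 0 = unknown total
}

// Hypothetical usage in place of the content-length lookup in the answer above:
// const total = totalFromHeaders(response.headers);
```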
Using this utility:
async function* streamAsyncIterable(stream) {
  const reader = stream.getReader()
  try {
    while (true) {
      const { done, value } = await reader.read()
      if (done) return
      yield value
    }
  } finally {
    reader.releaseLock()
  }
}
Then you can use a for await...of loop:
const response = await fetch(url)
let responseSize = 0
for await (const chunk of streamAsyncIterable(response.body)) {
  responseSize += chunk.length
}
But be aware that `responseSize` is the response size, not necessarily the download size! What is the difference? There is no difference if there is no content-encoding (`gzip`, `br`, ...). But if compression was applied, the final download size will be the size of the compressed data (the same as `content-length`), while the final response size will be the size of the uncompressed data. See @ecthiender's comment and this thread.

Mir-Ismaili
- This seems like a good answer. I have one issue: I tried to call foo = await response.json() afterwards and it failed because the body stream was already read. Is there a way to get the JSON from the response at the end? – dooderson Oct 14 '22 at 21:50
- @dooderson; AFAIK, in that case, you need to do that manually. I mean you need to read (and concatenate) `totalBytes` (in the same `for await` loop). Then you need to convert `totalBytes` to `string` and then `JSON.parse()` it. Don't convert every `chunk` to `string` (then concatenate `string`s)! This causes issues with multi-byte characters that may be placed at the boundaries of two sequent chunks. – Mir-Ismaili Oct 17 '22 at 10:38
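The manual approach described in the comment above can be sketched like this: collect the raw chunks, concatenate them into a single Uint8Array, and decode only once at the end, so multi-byte characters split across chunk boundaries decode correctly. The helper name `chunksToJson` is my own:

```javascript
// Concatenate all byte chunks, then decode and JSON.parse once.
// Decoding each chunk separately would corrupt multi-byte characters
// that straddle a chunk boundary; decoding the joined bytes avoids that.
function chunksToJson(chunks) {
  const size = chunks.reduce((n, c) => n + c.byteLength, 0);
  const bytes = new Uint8Array(size);
  let offset = 0;
  for (const chunk of chunks) {
    bytes.set(chunk, offset);
    offset += chunk.byteLength;
  }
  return JSON.parse(new TextDecoder().decode(bytes));
}

// Hypothetical usage with streamAsyncIterable() from the answer above:
// const chunks = [];
// for await (const chunk of streamAsyncIterable(response.body)) chunks.push(chunk);
// const json = chunksToJson(chunks);
```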
You can use axios instead:
import axios from 'axios'

export async function uploadFile(file, cb) {
  const url = `//127.0.0.1:4000/profile`
  try {
    let formData = new FormData()
    formData.append("avatar", file)
    const data = await axios.post(url, formData, {
      onUploadProgress: (progressEvent) => {
        console.log(progressEvent)
        if (progressEvent.lengthComputable) {
          let percentComplete = progressEvent.loaded / progressEvent.total;
          if (cb) {
            cb(percentComplete)
          }
        }
      }
    })
    return data
  } catch (error) {
    console.error(error)
  }
}
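Note that this answer shows upload progress, while the question asks about downloads. In browser environments axios also accepts an onDownloadProgress option with the same event shape; a sketch of the percentage handling (the callback wiring here is my own illustration):

```javascript
// Convert an axios/XHR progress event into a whole-number percentage,
// or null when the total is unknown (lengthComputable is false).
function toPercent(progressEvent) {
  const { loaded, total } = progressEvent;
  if (!total) return null;
  return Math.round((loaded / total) * 100);
}

// Hypothetical usage with axios' browser adapter:
// const { data } = await axios.get(url, {
//   responseType: 'blob',
//   onDownloadProgress: (e) => { const p = toPercent(e); if (p !== null) cb(p); },
// });
```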

Haile