195

I'm struggling to find documentation or examples of implementing an upload progress indicator using fetch.

This is the only reference I've found so far, which states:

Progress events are a high level feature that won't arrive in fetch for now. You can create your own by looking at the Content-Length header and using a pass-through stream to monitor the bytes received.

This means you can explicitly handle responses without a Content-Length differently. And of course, even if Content-Length is there it can be a lie. With streams you can handle these lies however you want.

How would I write "a pass-through stream to monitor the bytes" sent? If it makes any sort of difference, I'm trying to do this to power image uploads from the browser to Cloudinary.

NOTE: I am not interested in the Cloudinary JS library, as it depends on jQuery and my app does not. I'm only interested in the stream processing necessary to do this with native JavaScript and GitHub's fetch polyfill.


https://fetch.spec.whatwg.org/#fetch-api

neezer
    @Magix See [Aborting a fetch: The Next Generation #447](https://github.com/whatwg/fetch/issues/447) – guest271314 Aug 07 '17 at 00:47
  • @guest271314 The link above is, again, for using streams in HTTP *responses*, not requests. – Armen Michaeli Feb 12 '21 at 16:30
  • Very disappointing to see that 4 years later there is still no solution using `fetch` API: https://fetch.spec.whatwg.org/#fetch-api `it is currently lacking when it comes to request progression (not response progression)` – fguillen Sep 25 '21 at 16:20
  • Modern browsers, no IE: https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream – Phil Tune Dec 20 '21 at 00:53
  • This link seems to have some concepts implemented of what the OP asks for: https://dev.to/tqbit/how-to-monitor-the-progress-of-a-javascript-fetch-request-and-cancel-it-on-demand-107f – pebox11 May 08 '23 at 14:25

13 Answers

67

Streams are starting to land in the web platform (https://jakearchibald.com/2016/streams-ftw/) but it's still early days.

Soon you'll be able to provide a stream as the body of a request, but the open question is whether the consumption of that stream relates to bytes uploaded.

In particular, redirects can result in data being retransmitted to the new location, but streams cannot "restart". We could fix this by turning the body into a callback that can be called multiple times, but we'd need to be sure that exposing the number of redirects isn't a security leak, since it'd be the first time on the platform that JS could detect that.

Some are questioning whether it even makes sense to link stream consumption to bytes uploaded.

Long story short: this isn't possible yet, but in future this will be handled either by streams, or some kind of higher-level callback passed into fetch().
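For readers arriving later: once a runtime does accept a ReadableStream request body, the byte-counting pass-through the question asks about can be sketched roughly as below. This is a sketch under that assumption, not something that worked at the time of this answer; the upload URL is a placeholder, and the `duplex` option reflects what later implementations ended up requiring.

```javascript
// Sketch only: assumes a runtime where fetch() accepts a ReadableStream
// body. progressStream() is a pass-through TransformStream that counts
// bytes as chunks flow through it.
function progressStream(totalBytes, onProgress) {
  let sent = 0;
  return new TransformStream({
    transform(chunk, controller) {
      controller.enqueue(chunk); // pass the chunk along unchanged...
      sent += chunk.byteLength;  // ...while counting it on the way past
      onProgress(sent / totalBytes);
    },
  });
}

// Hypothetical usage with a file input:
// const file = input.files[0];
// fetch("https://example.com/upload", {
//   method: "POST",
//   body: file.stream().pipeThrough(progressStream(file.size, console.log)),
//   duplex: "half", // required by implementations that support stream bodies
// });
```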

JaffaTheCake
56

fetch: Chrome only

Browsers are working on supporting a ReadableStream as the fetch body. For Chrome, this has been implemented since v105. For other browsers, it's currently not implemented.

(Note that duplex: "half" is currently required in order to use a stream body with fetch.)

A custom TransformStream can be used to track progress. Here's a working example:

warning: this code does not work in browsers other than Chrome

async function main() {
  const blob = new Blob([new Uint8Array(10 * 1024 * 1024)]); // any Blob, including a File
  const uploadProgress = document.getElementById("upload-progress");
  const downloadProgress = document.getElementById("download-progress");

  const totalBytes = blob.size;
  let bytesUploaded = 0;

  // Use a custom TransformStream to track upload progress
  const progressTrackingStream = new TransformStream({
    transform(chunk, controller) {
      controller.enqueue(chunk);
      bytesUploaded += chunk.byteLength;
      console.log("upload progress:", bytesUploaded / totalBytes);
      uploadProgress.value = bytesUploaded / totalBytes;
    },
    flush(controller) {
      console.log("completed stream");
    },
  });
  const response = await fetch("https://httpbin.org/put", {
    method: "PUT",
    headers: {
      "Content-Type": "application/octet-stream"
    },
    body: blob.stream().pipeThrough(progressTrackingStream),
    duplex: "half",
  });
  
  // After the initial response headers have been received, display download progress for the response body
  let success = true;
  const totalDownloadBytes = response.headers.get("content-length");
  let bytesDownloaded = 0;
  const reader = response.body.getReader();
  while (true) {
    try {
      const { value, done } = await reader.read();
      if (done) {
        break;
      }
      bytesDownloaded += value.length;
      if (totalDownloadBytes != undefined) {
        console.log("download progress:", bytesDownloaded / totalDownloadBytes);
        downloadProgress.value = bytesDownloaded / totalDownloadBytes;
      } else {
        console.log("download progress:", bytesDownloaded, ", unknown total");
      }
    } catch (error) {
      console.error("error:", error);
      success = false;
      break;
    }
  }
  
  console.log("success:", success);
}
main().catch(console.error);
upload: <progress id="upload-progress"></progress><br/>
download: <progress id="download-progress"></progress>

workaround: good ol' XMLHttpRequest

Instead of fetch(), it's possible to use XMLHttpRequest to track upload progress — the xhr.upload object emits a progress event.

async function main() {
  const blob = new Blob([new Uint8Array(10 * 1024 * 1024)]); // any Blob, including a File
  const uploadProgress = document.getElementById("upload-progress");
  const downloadProgress = document.getElementById("download-progress");

  const xhr = new XMLHttpRequest();
  const success = await new Promise((resolve) => {
    xhr.upload.addEventListener("progress", (event) => {
      if (event.lengthComputable) {
        console.log("upload progress:", event.loaded / event.total);
        uploadProgress.value = event.loaded / event.total;
      }
    });
    xhr.addEventListener("progress", (event) => {
      if (event.lengthComputable) {
        console.log("download progress:", event.loaded / event.total);
        downloadProgress.value = event.loaded / event.total;
      }
    });
    xhr.addEventListener("loadend", () => {
      resolve(xhr.readyState === 4 && xhr.status === 200);
    });
    xhr.open("PUT", "https://httpbin.org/put", true);
    xhr.setRequestHeader("Content-Type", "application/octet-stream");
    xhr.send(blob);
  });
  console.log("success:", success);
}
main().catch(console.error);
upload: <progress id="upload-progress"></progress><br/>
download: <progress id="download-progress"></progress>
jtbandes
  • https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream – Phil Tune Dec 20 '21 at 00:52
  • Possible for request body or response body, or both? – Armen Michaeli Jan 20 '22 at 15:29
  • If you run the XHR example code above you'll see it works for both request and response body progress. These are separate event listeners on XMLHttpRequest. For `fetch()`, [`response.body`](https://developer.mozilla.org/en-US/docs/Web/API/Response/body) is a stream that can be used to track download progress. – jtbandes Jan 20 '22 at 21:35
54

My solution is to use axios, which supports this pretty well:

axios.request({
  method: "post",
  url: "/aaa",
  data: myData,
  onUploadProgress: (p) => {
    console.log(p);
    // this.setState({ fileprogress: p.loaded / p.total });
  }
}).then(data => {
  // this.setState({ fileprogress: 1.0 });
});

I have an example of using this in React on GitHub.

dwjohnston
  • That was my solution as well. Axios seems to fit the mold really well. – Jason Rice Jul 02 '18 at 22:26
  • Does `axios` use `fetch` or `XMLHttpRequest` under-the-hood? – Dai Apr 04 '19 at 11:13
  • XMLHttpRequest. If you are using this for react native, beware that XMLHttpRequest seems to be VERY VERY slow to parse large json responses when compared to fetch (about 10 times slower, and it freezes the whole ui thread). – Cristiano Coelho Apr 07 '19 at 20:13
  • To get progress in % `this.setState({ fileprogress: Math.round( (p.loaded * 100) / p.total ) })` – Liam May 28 '19 at 00:01
  • This does not answer the question, especially because `axios` doesn't use `fetch` under the hood, and has no such support. I'm literally authoring it now _for_ them so. – Sam Gammon Apr 12 '20 at 22:39
  • I agree that this is not the solution for the specific question but having in consideration that there is not a solution for the specific question I vote up this answer. – fguillen Sep 25 '21 at 16:23
  • @DerekHenderson If the real answer is "you can't do x in y" then "do x in z instead" could be useful to many people. – ssp May 28 '22 at 15:24
  • also note that this does not work in nodejs – Telion Dec 05 '22 at 16:07
21

As already explained in the other answers, it is not possible with fetch, but it is with XHR. Here is my a-little-more-compact XHR solution:

const uploadFiles = (url, files, onProgress) =>
  new Promise((resolve, reject) => {
    const xhr = new XMLHttpRequest();
    xhr.upload.addEventListener('progress', e => onProgress(e.loaded / e.total));
    xhr.addEventListener('load', () => resolve({ status: xhr.status, body: xhr.responseText }));
    xhr.addEventListener('error', () => reject(new Error('File upload failed')));
    xhr.addEventListener('abort', () => reject(new Error('File upload aborted')));
    xhr.open('POST', url, true);
    const formData = new FormData();
    Array.from(files).forEach((file, index) => formData.append(index.toString(), file));
    xhr.send(formData);
  });

Works with one or multiple files.

If you have a file input element like this:

<input type="file" multiple id="fileUpload" />

Call the function like this:

document.getElementById('fileUpload').addEventListener('change', async e => {
  const onProgress = progress => console.log('Progress:', `${Math.round(progress * 100)}%`);
  const response = await uploadFiles('/api/upload', e.currentTarget.files, onProgress);
  if (response.status >= 400) {
    throw new Error(`File upload failed - Status code: ${response.status}`);
  }
  console.log('Response:', response.body);
});

Also works with the e.dataTransfer.files you get from a drop event when building a file drop zone.

Ricki-BumbleDev
  • it may not be useful when you want to show progress for *both* the file upload and the response (a typical scenario is uploading a big csv file, after which the server does some slow conversion whose progress we want to show as well) – Nir O. Mar 17 '23 at 15:12
17

Update: as the accepted answer says, it's impossible for now. But the code below handled our problem for some time (note that it tracks download progress, not upload progress). I should add that in the end we had to switch to a library based on XMLHttpRequest.

const response = await fetch(url);
const total = Number(response.headers.get('content-length'));

const reader = response.body.getReader();
let bytesReceived = 0;
while (true) {
    const result = await reader.read();
    if (result.done) {
        console.log('Fetch complete');
        break;
    }
    bytesReceived += result.value.length;
    console.log('Received', bytesReceived, 'bytes of data so far');
}

thanks to this link: https://jakearchibald.com/2016/streams-ftw/

Hosseinmp76
  • Nice, but does it apply to uploads as well? – kernel Feb 04 '19 at 14:22
  • @kernel I tried to find out but I wasn't able to do it. and I like to find a way to do this for upload too. – Hosseinmp76 Jul 08 '19 at 12:53
  • Same same, but so far I wasn't too lucky finding/creating a functioning upload example. – kernel Jul 09 '19 at 11:49
  • `content-length` !== length of body. When http compression is used (common for big downloads), the content-length is the size after the http compression, while the length is the size after the file has been extracted. – Ferrybig Sep 12 '19 at 08:37
  • Your code assumes that the content header length specifies the amount of bytes the fetch is going to download. This is not always true, so your code cannot show progress to the user, as `bytesReceived` becomes bigger than `total` – Ferrybig Sep 13 '19 at 10:08
  • Moreover, not even the browser knows the actual content length beforehand. All you're going to get is a post-compression progress indicator. For example, if you're downloading a zip file with unevenly distributed compression ratio (some files are random, some are low entropy) you'll notice that the progress indicator is severely skewed. – cutsoy Dec 07 '19 at 00:01
  • @Ferrybig `totalBytes = response.headers.get('Content-Encoding') !== 'gzip' ? Number(response.headers.get('Content-Length')) : null;` – yyny Apr 30 '20 at 11:18
11

with fetch: now possible with Chrome >= 105

How to: https://developer.chrome.com/articles/fetch-streaming-requests/

Currently not supported by other browsers (that may have changed by the time you read this; please edit this answer accordingly).

Feature detection (source)

const supportsRequestStreams = (() => {
  let duplexAccessed = false;

  const hasContentType = new Request('', {
    body: new ReadableStream(),
    method: 'POST',
    get duplex() {
      duplexAccessed = true;
      return 'half';
    },
  }).headers.has('Content-Type');

  return duplexAccessed && !hasContentType;
})();

HTTP >= 2 required

The fetch will be rejected if the connection is HTTP/1.x.
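A sketch (not a definitive implementation) of putting the feature detection together with a graceful fallback: `fallbackUpload` is a placeholder for something like the XHR approach in other answers, and the `TypeError` catch is where the HTTP/1.x rejection surfaces.

```javascript
// Sketch: stream the body through a byte-counting TransformStream when
// request streams are supported, and fall back to a non-streaming upload
// otherwise -- including when the streaming fetch itself rejects, which
// happens on an HTTP/1.x connection. `fallbackUpload` is a placeholder
// (e.g. an XHR-based uploader); pass supportsRequestStreams from above.
async function uploadWithProgress(url, blob, onProgress, streamsSupported, fallbackUpload) {
  if (!streamsSupported) return fallbackUpload(url, blob, onProgress);
  let sent = 0;
  const counter = new TransformStream({
    transform(chunk, controller) {
      controller.enqueue(chunk);      // pass through unchanged
      sent += chunk.byteLength;       // count bytes as they are pulled
      onProgress(sent / blob.size);
    },
  });
  try {
    return await fetch(url, {
      method: "POST",
      body: blob.stream().pipeThrough(counter),
      duplex: "half", // required for stream bodies
    });
  } catch (err) {
    // e.g. the connection turned out to be HTTP/1.x: retry without streaming
    if (err instanceof TypeError) return fallbackUpload(url, blob, onProgress);
    throw err;
  }
}
```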

Gabriel
6

I don't think it's possible. The draft states:

it is currently lacking [in comparison to XHR] when it comes to request progression


(old answer):
The first example in the Fetch API chapter gives some insight on how to do this:

If you want to receive the body data progressively:

function consume(reader) {
  var total = 0
  return new Promise((resolve, reject) => {
    function pump() {
      reader.read().then(({done, value}) => {
        if (done) {
          resolve()
          return
        }
        total += value.byteLength
        log(`received ${value.byteLength} bytes (${total} bytes in total)`)
        pump()
      }).catch(reject)
    }
    pump()
  })
}

fetch("/music/pk/altes-kamuffel.flac")
  .then(res => consume(res.body.getReader()))
  .then(() => log("consumed the entire body without keeping the whole thing in memory!"))
  .catch(e => log("something went wrong: " + e))

Apart from their use of the Promise constructor antipattern, you can see that response.body is a Stream from which you can read byte by byte using a Reader, and you can fire an event or do whatever you like (e.g. log the progress) for each of them.

However, the Streams spec doesn't appear to be quite finished, and I have no idea whether this already works in any fetch implementation.

Bergi
  • If I read that example correctly, though, this would be for **downloading** a file via `fetch`. I'm interested in progress indicators for **uploading** a file. – neezer Feb 29 '16 at 23:52
  • Oops, that quote talks about *receiving* bytes, which confused me. – Bergi Mar 01 '16 at 00:08
  • @Bergi Note, `Promise` constructor is not necessary. `Response.body.getReader()` returns a `Promise`. See [How to solve Uncaught RangeError when download large size json](http://stackoverflow.com/questions/39959467/how-to-solve-uncaught-rangeerror-when-download-large-size-json) – guest271314 Dec 17 '16 at 23:55
  • @guest271314 yeah, [I've fixed it at the source](https://github.com/whatwg/fetch/pull/227) of the quote already. And no, [`getReader`](https://streams.spec.whatwg.org/#rs-get-reader) does not return a promise. No idea what this has to do with the post you linked. – Bergi Dec 18 '16 at 13:35
  • @Bergi Yes, you are correct `.getReader()`'s `.read()` method returns a `Promise`. That is what was trying to convey. The link is to allude to the premise that if progress can be checked for download, progress can be checked for upload. Put together a pattern which returns expected result, to an appreciable degree; that is progress for `fetch()` upload. Have not found a way to `echo` a `Blob` or `File` object at jsfiddle, probably missing something simple. Testing at `localhost` uploads file very rapidly, without mimicking network conditions; though just remembered `Network throttling`. – guest271314 Dec 18 '16 at 16:48
4

Since none of the other answers solve the problem, here is a workaround:

Just for implementation's sake, you can measure the upload speed with a small initial chunk of known size, and then estimate the total upload time as content-length / upload-speed. You can use that estimate to drive a progress indicator.
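A minimal sketch of that estimation, assuming a hypothetical probe endpoint that simply accepts and discards the posted bytes:

```javascript
// Time a small probe upload of known size, derive bytes/sec from it, and
// estimate how long the full upload will take. The probe URL is a
// placeholder; real-world speeds vary, so treat the result as rough.
async function estimateUploadSeconds(probeUrl, totalBytes, probeBytes = 64 * 1024) {
  const probe = new Blob([new Uint8Array(probeBytes)]); // known-size chunk
  const t0 = performance.now();
  await fetch(probeUrl, { method: "POST", body: probe }); // time the probe
  const elapsed = (performance.now() - t0) / 1000;
  const bytesPerSecond = probeBytes / elapsed;
  return totalBytes / bytesPerSecond; // estimated total upload duration
}
```

The real upload then runs separately while a timer advances the bar toward the estimate; as the comments below note, the estimate can be badly off when network speed varies.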

Shishir Arora
  • Very clever, nice trick to use while we wait for a realtime solution :) – Magix Aug 12 '17 at 07:20
  • Too risky for me. Wouldn't want to [end up like the windows copy file progress bar](https://superuser.com/questions/43562/windows-file-copy-dialog-why-is-the-estimation-so-bad) – Jack G Jun 18 '18 at 20:20
  • Not reliable, complex and will show incorrect values. – zdm Jul 26 '21 at 17:30
0

A possible workaround would be to utilize the new Request() constructor, then check the Request.bodyUsed Boolean attribute

The bodyUsed attribute’s getter must return true if disturbed, and false otherwise.

to determine if the stream is disturbed

An object implementing the Body mixin is said to be disturbed if body is non-null and its stream is disturbed.

Return the fetch() Promise from within .then() chained to recursive .read() call of a ReadableStream when Request.bodyUsed is equal to true.

Note, the approach does not read the bytes of the Request.body as the bytes are streamed to the endpoint. Also, the upload could complete well before any response is returned in full to the browser.

const [input, progress, label] = [
  document.querySelector("input")
  , document.querySelector("progress")
  , document.querySelector("label")
];

const url = "/path/to/server/";

input.onmousedown = () => {
  label.innerHTML = "";
  progress.value = "0"
};

input.onchange = (event) => {

  const file = event.target.files[0];
  const filename = file.name;
  progress.max = file.size;

  const request = new Request(url, {
    method: "POST",
    body: file,
    cache: "no-store"
  });

  const upload = settings => fetch(settings);

  const uploadProgress = new ReadableStream({
    start(controller) {
        console.log("starting upload, request.bodyUsed:", request.bodyUsed);
        controller.enqueue(request.bodyUsed);
    },
    pull(controller) {
      if (request.bodyUsed) {
        controller.close();
      }
      controller.enqueue(request.bodyUsed);
      console.log("pull, request.bodyUsed:", request.bodyUsed);
    },
    cancel(reason) {
      console.log(reason);
    }
  });

  const [fileUpload, reader] = [
    upload(request)
    .catch(e => {
      reader.cancel();
      throw e
    })
    , uploadProgress.getReader()
  ];

  const processUploadRequest = ({value, done}) => {
    if (value || done) {
      console.log("upload complete, request.bodyUsed:", request.bodyUsed);
      // set `progress.value` to `progress.max` here 
      // if not awaiting server response
      // progress.value = progress.max;
      return reader.closed.then(() => fileUpload);
    }
    console.log("upload progress:", value);
    progress.value = +progress.value + 1;
    return reader.read().then(result => processUploadRequest(result));
  };

  reader.read().then(({value, done}) => processUploadRequest({value,done}))
  .then(response => response.text())
  .then(text => {
    console.log("response:", text);
    progress.value = progress.max;
    input.value = "";
  })
  .catch(err => console.log("upload error:", err));

}
guest271314
  • This achieves absolutely nothing. It is just a very complex syntax for showing progress/spinner and hiding it when request finishes. – Vočko Sep 01 '21 at 06:29
0

There is currently (2023) an npm package that wraps fetch, making it quite straightforward to monitor progress. It's called fetch-progress and is available on npm. I've found it quite helpful.

Here's the example given in their docs, which illustrates its simplicity:

fetch(this.props.src)
    .then(
      fetchProgress({
        // implement onProgress method
        onProgress(progress) {
          console.log({ progress });
          // A possible progress report you will get
          // {
          //    total: 3333,
          //    transferred: 3333,
          //    speed: 3333,
          //    eta: 33,
          //    percentage: 33
          //    remaining: 3333,
          // }
        },
      })
    )
user2330237
-2

I fished around for some time on this, so for everyone who may come across this issue too, here is my solution:

const form = document.querySelector('form');
const status = document.querySelector('#status');

// When the form gets submitted.
form.addEventListener('submit', async function (event) {
    // cancel default behavior (form submit)
    event.preventDefault();

    // Inform user that the upload has began
    status.innerText = 'Uploading..';

    // Create FormData from form
    const formData = new FormData(form);

    // Open request to origin
    const request = await fetch('https://httpbin.org/post', { method: 'POST', body: formData });

    // Get amount of bytes we're about to transmit
    const bytesToUpload = request.headers.get('content-length');

    // Create a reader from the request body
    const reader = request.body.getReader();

    // Cache how much data we already send
    let bytesUploaded = 0;

    // Get first chunk of the request reader
    let chunk = await reader.read();

    // While we have more chunks to go
    while (!chunk.done) {
        // Increase amount of bytes transmitted.
        bytesUploaded += chunk.value.length;

        // Inform user how far we are
        status.innerText = 'Uploading (' + (bytesUploaded / bytesToUpload * 100).toFixed(2) + ')...';

        // Read next chunk
        chunk = await reader.read();
    }
});
  • I don't think this is doing what you think it is. [fetch returns a Response](https://developer.mozilla.org/en-US/docs/Web/API/fetch#return_value), not a Request. Everywhere you refer to as `request` should actually be named `response`. You're doing everything else correctly, but what you're receiving is a response body and more of a "download" progress than an "upload" progress. – curiouser Aug 10 '22 at 15:09
-3
const res = await fetch('./foo.json');
const total = Number(res.headers.get('content-length'));
let loaded = 0;
// Note: this iterates the response body stream directly, which is not
// supported in all browsers
for await (const { length } of res.body) {
  loaded += length;
  const progress = ((loaded / total) * 100).toFixed(2); // toFixed(2) means two digits after the floating point
  console.log(`${progress}%`); // or yourDiv.textContent = `${progress}%`;
}
Leon Gilyadov
-15

The key part is the ReadableStream obtained from `response.body`.

Sample:

let parse = result => {
  console.log(result);
  //...
  return result.value ? true : false; // continue?
};

fetch('').then(response => {
  const reader = response.body.getReader();
  const pump = () => reader.read().then(parse).then(cont => cont ? pump() : undefined);
  return pump();
});

You can test it by running it on a huge page, e.g. https://html.spec.whatwg.org/ or https://html.spec.whatwg.org/print.pdf . Press Ctrl+Shift+J and paste the code in.

(Tested on Chrome.)

Pacerier