
When using the Fetch API in the simplest possible way, Chrome doesn't seem to garbage collect correctly. Am I doing something wrong?

for (let i = 0; i < 100; i++) {
  fetch('https://upload.wikimedia.org/wikipedia/commons/3/3d/LARGE_elevation.jpg')
    .then(response => {
      console.log('Memory-bloating')
    })
}

https://jsfiddle.net/dozrpcvj/12/

This JSFiddle fills the memory with 1.4GB, which isn't released until you either manually garbage collect or close the tab. If you increase the number of iterations to 1000, it "downloads" 14GB (from its own disk cache) and, instead of garbage collecting, starts filling the swap file on disk.

Am I doing something wrong or is this a bug in Chrome? When testing with Safari, it also fills the hard drive with 1.4GB, but starts garbage collecting as soon as it's done.

PS. You can't use the memory profiler, since it tells you that you only use a few MB of data, even when Activity Monitor or Chrome's own Task Manager says 1.4GB.

VLAZ
user2687506
    Interesting note - I can't seem to find that memory in the dev tools. Taking a memory snapshot reports ~10 meg, looking at the performance tab, it doesn't report abnormal memory usage - from ~8 meg to ~16 at most, during load. Task manager does show more than 1.5 gig of memory used and I have a single Chrome tab open. Hitting record on the performance tab and then following up with garbage collection frees up about a meg of JS heap memory (11MB -> 10MB) but task manager reports at least a gig of memory released. – VLAZ Oct 08 '18 at 06:46
  • @vlaz Yes, that is what I've noted too. I've reported it to Google through Chrome, but I wanted to know if I were the one using fetch the wrong way, or Chrome handling fetch the wrong way. – user2687506 Oct 08 '18 at 06:58
  • For what it's worth, I get similar results in Firefox. Sometimes. The JSFiddle blows up my memory usage from 1.2GB to ~3.3GB but then it cleans up. However, executing the same code in the console hits around 2.7GB and lingers there for a while more before it cleans up after itself. It appears as if the GC sometimes kicks in later. I seem to have a similar result with Chrome on occasions: open new tab -> run the code in the console. The *first* time I do it, it reaches ~1.6GB and then goes back to ~250MB. The *second* time I do it in the same tab (no reloading), the memory stays high. – VLAZ Oct 08 '18 at 07:09
  • Have you tried calling `response.blob()` inside the `then` block to see if that will trigger the GC sooner? – robertklep Oct 08 '18 at 08:17
  • @robertklep yes, but unfortunately we haven't seen any change in the GC. – user2687506 Oct 08 '18 at 08:52
  • @vlaz oh, interesting. So it might be some "smart" caching, implemented horribly wrong? (working correctly the first time, and not the second) – user2687506 Oct 08 '18 at 08:55
  • It sort of seems like cacheing. Although I'm not entirely sure it works. It might be putting all the promise objects somewhere in case you want to re-fetch them during the lifetime of the page. Although, I'd say it's also wrong, as you can clearly see it takes up a lot more memory than it probably should. The weird thing is that calling GC manually clears whatever is in memory. So, it could be that automatic GC (purposefully?) skips these objects while manual doesn't or otherwise problem with GC itself. – VLAZ Oct 08 '18 at 09:05
  • I think Fetch uses HTTP-cache, likely the increase in memory is just the cache increasing. If so, the cache should probably clear when the browser dictates the cache is growing full. – Mystogan Oct 08 '18 at 09:10
  • @Mystogan you might be right, but the problem doesn't exist with XMLHttpRequest, and we're not fetching any new items over HTTP. The loop fetches the image once and then just gets it from the HTTP cache. If it's getting the image from the cache AND putting it into the cache, it's probably not a correct implementation of the HTTP cache. – user2687506 Oct 08 '18 at 09:14
  • 1
    I've tried many things including setting null to variables, resizing arrays and triggering GC manually. In the end, I've settled with refreshing the page. That immediately brings down the memory usage, but introduces a lag until I can execute more javascript in the browser. – nurettin Nov 27 '18 at 14:34
  • Did you ever find a solution for this? – Dustin Kerstein May 14 '20 at 16:40
  • No, and Google never came back with any information on the issue. Is this still an issue? – user2687506 May 20 '20 at 08:25
  • I am seeing this too, using `fetch` for a massive download (70000+ files) in Cordova. Is there a reference to somewhere we can all tell Google that this is a problem, or can we coordinate something? – beruic Jun 29 '20 at 12:18
  • @beruic I never got any information from Google and solved my issues in other ways. If you want to, please find the appropriate channels to let them know! :) – user2687506 Jul 07 '20 at 19:32
  • Apart from as an exercise in hammering fetch, is there ever a use case for a tight loop hitting off 100 fetch? I would personally use a function and call it using setTimeout in the done part using a counter: `let cnt = 0; const getUrl = () => { if (cnt>=100) return; fetch(....).then(response => {....; setTimeout(getUrl,500)})}; getUrl();` Does that give memory issues too? – mplungjan Feb 06 '21 at 07:36
  • I'm not getting any memory issue....Also would never use a for loop for this. I would async await this recursively. I wanted to put in my recursion implementation as an answer but couldnt reproduce your memory bloat. Im in chrome on windows. Here try this ```(async function apiReq(i){ if (i > 0){ const res = await fetch('https://upload.wikimedia.org/wikipedia/commons/3/3d/LARGE_elevation.jpg') if (res){ apiReq(--i) console.log(i) } } return })(100) ``` – Michael Paccione Apr 07 '21 at 04:17
  • 1
    Chrome is doing everything as it should and you are doing nothing wrong. The allocation of the memory is by design. Chrome automatical collects garbarge in intervals and the memory will be free'd again. Either by you (if you press the recycle bin inside inspector) or [automatically in ("unknown") intervals](https://groups.google.com/a/chromium.org/g/chromium-discuss/c/eiO7Qxfyefk?pli=1). If you would store every response-reference in (for example) `window["fetch-n"]`, the memory allocation would remain after garbarge collection. – Christopher May 14 '21 at 00:30

3 Answers


You're doing nothing wrong, as that is the correct way to call fetch:

https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API/Using_Fetch

It's not a bug in Chrome either; it behaves as expected. For each fetch it creates an object that is updated throughout the promise's completion: all HTTP headers (sent and received), the actual image data, you name it. You're calling it 100 times, so it's about 14 MB per promise. Eventually the promises complete and the garbage collector frees the memory. I've checked the fiddle, and while it takes some time for this to happen (minutes), the memory is eventually released.

Maybe if you explain why and how you need to launch 100 HTTP calls at the same time, there may be a way to do it so the browser doesn't reserve 1.4 GB of RAM. `Promise.all` or `Promise.race` may be more memory-efficient.
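As a sketch of that idea: the helper below caps how many requests are in flight at once by awaiting each batch with `Promise.all` before starting the next. `runInBatches` and the batch size are assumptions for illustration, not an existing API.

```javascript
// Hypothetical helper: process `items` with `worker`, at most `batchSize`
// at a time, so only that many responses are held in memory concurrently.
async function runInBatches(items, batchSize, worker) {
  const results = [];
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    // Wait for the whole batch to settle before starting the next one.
    results.push(...await Promise.all(batch.map(worker)));
  }
  return results;
}

// With fetch it might look like (10 concurrent requests instead of 100):
// await runInBatches(urls, 10, url => fetch(url).then(r => r.blob()));
```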

  • We used this in React to realise when we should throw a "There is a new version"-prompt to our user. We did one request every 5 minutes, and after a few weeks with no new version the tab that was open took a few gigs of memory. Maybe it is fixed now, but back then it never garbage collected. – user2687506 Oct 29 '21 at 06:22

You're doing nothing wrong; it's just that firing off a bunch of promises at the same time is going to use a lot of memory. If you want it to use less memory, and you're willing to have it take longer, you can put it all inside an async function and do this:

(async () => {
  for (let i = 0; i < 100; i++) {
    await fetch('https://upload.wikimedia.org/wikipedia/commons/3/3d/LARGE_elevation.jpg')
    console.log('Memory-bloating')
  }
})()
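Note that `fetch` resolves as soon as the headers arrive, and the loop above never reads the body. A variant that explicitly cancels each response body, so the browser knows the buffered data can be dropped, might look like the sketch below; `drainLoop` is a hypothetical helper name, and whether this changes Chrome's peak memory is not guaranteed.

```javascript
// Sequential loop with explicit body disposal. The body stays buffered
// until it is read or cancelled, so cancel it when the data isn't needed.
async function drainLoop(url, iterations) {
  for (let i = 0; i < iterations; i++) {
    const response = await fetch(url);
    if (response.body) {
      await response.body.cancel(); // discard the payload instead of keeping it
    }
  }
}

// e.g. drainLoop('https://upload.wikimedia.org/wikipedia/commons/3/3d/LARGE_elevation.jpg', 100)
```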
fetch('https://upload.wikimedia.org/wikipedia/commons/3/3d/LARGE_elevation.jpg')
    .then(response => response.blob()) // the URL is an image, so read it as a Blob, not JSON
    .then(data => console.log(data))
Sven Eberth