Browser: Chrome latest (101.x)

I am currently using a Web Worker to fetch image files, like this. In the web worker file:


    onmessage = function(oEvent) {

        var promises = [];
        // Walk a 2D array of URLs (loop bounds elided).
        for (var i = ..) {
            for (var j = ...) {
                // The IIFE captures the current i/j for the async callbacks below.
                (function(fI, sI) {
                    var p = new Promise((resolve, reject) => {
                        .....
                        fetch(urls[fI][sI]).then(...
                    });
                    promises.push(p);
                })(i, j);
            }
        }
        Promise.all(promises).then((results) => {
            ....blah..
        });
    };

No CORS concern; all of the URLs point to the same domain.

The urls array actually contains about 4,000 image URLs.

For the sake of simplicity, say the server is blah and the array only contains image files a through z:


    http://blah/..../a.jpg
    http://blah/..../b.jpg
    ...
    http://blah/..../z.jpg

I noticed that some of the files fetch fine, but others throw "TypeError: Failed to fetch". For example, sometimes a, b, and c throw the error; other times it's g, h, and x. I don't see any apparent pattern. I investigated the URLs that return the error, but there's nothing wrong with them; they are all valid.

So I was thinking that I was slamming the server with too many requests at once, and I used setTimeout to stagger the requests, but that still didn't resolve the issue.
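
The staggering looked roughly like this (the flattened array and the 10 ms delay step are illustrative, not the exact code):

    // Rough sketch of the setTimeout staggering that was tried (delay step is illustrative).
    // Note: this only delays when each fetch *starts*; it does not bound how many
    // requests are actually in flight at the same time.
    urls.flat().forEach(function(url, index) {
        setTimeout(function() {
            fetch(url).then(/* ... */).catch(/* ... */);
        }, index * 10);
    });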

Obviously, if I reduce the number of requests, everything seems fine.

So is there any limit on how many fetch requests I can make from a Web Worker? Is there no way I can make this work? I don't really care about performance here; as long as I can fetch all 4,000 successfully, I'm fine.

Any ideas?

Thanks,

Updated: Here is the captured image.

Developer Tool Images for Debugging

I just captured part of the requests from the Developer Tools. The `fetching.js:45` reference points to this line: `fetch(urls[fI][sI]).then(...`.

  • There will be no benefit and perhaps harm in trying to request ALL your images in parallel. At best the browser will just queue most of the requests and only allow a few to be sent to the server at a time. At worst, the server will barf on how many simultaneous requests are all being sent from the same client. I'd suggest using something like [mapConcurrent()](https://stackoverflow.com/questions/46654265/promise-all-consumes-all-my-ram/46654592#46654592), which will iterate the entire array and get all responses but send only N requests at a time (see the sketch after these comments). – jfriend00 Apr 29 '22 at 16:27
  • To know exactly why you're getting a failure, we'd need to know more about the specific failure (precise details of the error) you're getting and whether it's client-side (perhaps memory or sockets) or whether you're getting an error back from the server (in which case the server is objecting). In the worst case, you may have to slow down the requests because of server-side rate limiting. – jfriend00 Apr 29 '22 at 16:29
  • Shouldn't I see "Pending" for the requests that can't be processed yet? I have no idea why they go straight to "Failed". – futurebaby Apr 29 '22 at 16:32
  • I tried to look at the details of the error, but it doesn't contain any detailed message; it just says "TypeError: Failed to fetch". Like I said, I also tried to slow down the requests (using setTimeout), but no luck. – futurebaby Apr 29 '22 at 16:34
  • There should be more details somewhere than "failed to fetch". There should at least be a status code and there should be even more info than that in the debug console (network tab). You need to log all possible errors from your fetch call. And, depending upon how you implemented the `setTimeout()`, that may or may not have really helped anything. Ultimately, you need to control how many requests are in flight at the same time which cannot be done purely with `setTimeout()`. – jfriend00 Apr 29 '22 at 16:40
  • FYI, your current code does not even have a `.catch()` on your `fetch()` call, so you're not even paying attention to errors. And you're using a promise anti-pattern by wrapping `fetch()` in a manual promise: `fetch()` already returns a promise, so you don't need to wrap it in another one. This needs basic debugging where you log all possible errors, and that gives you hints as to what is going wrong. – jfriend00 Apr 29 '22 at 16:42
  • OK, there is no status code at all; it just says "Failed". Please look at the pic I just added. – futurebaby Apr 29 '22 at 16:48
  • OK, actually, there is a catch() in my actual code; the "Failed to fetch" error is the one I got from the catch clause. – futurebaby Apr 29 '22 at 16:49
  • One solution I can think of is to split the urls array into small buckets, maybe 500 × 8, so 8 buckets of requests. I guess that should work, but I'm still curious why the requests go straight to "Failed" instead of showing a "Pending" status. – futurebaby Apr 29 '22 at 16:56
  • I've already linked you in my first comment to code that works for problems like this. It's more sophisticated and accurate than chunking into 500 at a time. I'd really recommend you use code like that as it's the elegant way to solve these types of problems. – jfriend00 Apr 29 '22 at 16:57
  • If it's failing right away, then perhaps the browser is enforcing some limit on concurrency due to resource levels. That limit would not necessarily be the same from one browser to the next so there's no point in trying to find out exactly what it is. If you use the above referenced `mapConcurrent()` and set the concurrency level to something like 5, it should make any client-side issues go away. – jfriend00 Apr 29 '22 at 16:58
  • Who controls the server? If it's not yours, they might have added rate limiting and slow-down features. – bogdanoff Apr 29 '22 at 16:59
  • OK, I saw that; yes, yours could be the better approach. However, my question remains: is this some sort of limit on the fetch API? Say you made 10,000 requests; aren't the requests that can't be processed yet supposed to be "Pending", and then processed successfully once they can be? – futurebaby Apr 29 '22 at 17:01
  • The server is mine. – futurebaby Apr 29 '22 at 17:02
  • Then, as jfriend00 said, the browser is enforcing a limit to prevent high CPU usage. I think this limit is not just for the fetch API; it's for all network-related (AJAX) APIs. – bogdanoff Apr 29 '22 at 17:17
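
A minimal sketch of the concurrency-limiting approach described in the comments, assuming the urls 2D array has been flattened into a plain array; the fetchAllWithLimit helper, the blob() handling, and the limit of 5 are illustrative stand-ins for the linked mapConcurrent():

    // Illustrative stand-in for the mapConcurrent() idea linked in the comments:
    // run at most `limit` fetches at a time over a flat array of URLs.
    function fetchAllWithLimit(urls, limit) {
        var results = new Array(urls.length);
        var next = 0;
        function worker() {
            // Each worker keeps pulling the next URL until the list is exhausted.
            if (next >= urls.length) return Promise.resolve();
            var index = next++;
            return fetch(urls[index])
                .then(function(response) {
                    // Surface server-side errors instead of silently ignoring them.
                    if (!response.ok) throw new Error("HTTP " + response.status + " for " + urls[index]);
                    return response.blob();
                })
                .then(function(blob) { results[index] = blob; })
                .then(worker); // move on to the next URL once this one finishes
        }
        // Start `limit` workers; Promise.all settles once every URL is handled.
        var workers = [];
        for (var k = 0; k < Math.min(limit, urls.length); k++) {
            workers.push(worker());
        }
        return Promise.all(workers).then(function() { return results; });
    }

    // Usage inside onmessage (a limit of 5, per the comments):
    // fetchAllWithLimit(urls.flat(), 5)
    //     .then(function(blobs) { /* ....blah.. */ })
    //     .catch(function(err) { console.error(err); });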

0 Answers