1

So I was trying to run multiple fetch requests in parallel like this:

    fetch(
        `http://localhost/test-paralell-loading.php?r`
    );

    fetch(
        `http://localhost/test-paralell-loading.php?r`
    );

    fetch(
        `http://localhost/test-paralell-loading.php?r`
    );

But they are unexpectedly running in sequence:

[Screenshot: Network panel showing the fetch requests running sequentially]

Is this an HTTP 1.1 limitation? How can I overcome it?

Update:

It seems that this happens on Chrome; on Firefox it behaves differently: [screenshot of Firefox's Network waterfall]

Is there something that can be done to improve the performance on Chrome browsers?

useless
  • 1,876
  • 17
  • 18
  • Does [this](https://stackoverflow.com/a/59586421/11768882) answer your question? – Arturo Mendes Mar 04 '23 at 12:00
  • @ArturoMendes The answer in your link states that `Promise.all will execute neither in parallel nor sequentially but concurrently`, whereas in this question the requests are being sent sequentially. – Fractalism Mar 04 '23 at 12:03
  • Does the server respond to multiple requests at the same time? I think maybe the grey section in the image is connection started which (I think) might mean that the fetch has started but the server hasn't yet started to respond. – Ben Stephens Mar 04 '23 at 12:09
  • @Fractalism that answer has many resources that explain the topic better than I can. Plus, when executing JavaScript in the browser, you depend on how many resources the environment wants to dedicate to your process – Arturo Mendes Mar 04 '23 at 12:15
  • @BenStephens it does respond to multiple requests at the same time; see the updated results for Firefox. – useless Mar 04 '23 at 12:32
  • 1
    @ArturoMendes that is not it. fetch requests run in the background and are not limited by `await` calls. Also, I updated the details, and Firefox seems to behave more in the way I would expect. – useless Mar 04 '23 at 12:34
  • Looks like requests are in the **stalled** phase of the Connection Start. This could be for any of the reasons under **queueing**. See this https://developer.chrome.com/docs/devtools/network/reference/#timing-preview – morganney Mar 04 '23 at 13:57
  • Show the full waterfall and see how many TCP connections are active to localhost. Assuming your server is using http 1.1. – morganney Mar 04 '23 at 13:59
  • 2
    Making three GET requests for exactly the same resource seems wasteful. My guess is that the browser is queuing them in the hope that the response can contain a cache header so it doesn't have to repeat the request for a second and third time. – Bergi Mar 04 '23 at 15:37
  • Related to Bergi's comment, could you try adding stuff to the query string of the requests. e.g. `http://localhost/test-paralell-loading.php?r=1`, `http://localhost/test-paralell-loading.php?r=2` etc.? – Ben Stephens Mar 04 '23 at 17:50
  • Bergi was right! Please post it as an answer and I'll accept it as the solution. – useless Mar 04 '23 at 19:05

2 Answers

-1

Your requests are running concurrently; it is just that some of them are in the stalled phase of Connection Start. This could be due to any of the reasons listed under queueing. See previewing a timing breakdown.

  • Queueing. The browser queues requests when:
    • There are higher priority requests.
    • There are already six TCP connections open for this origin, which is the limit. Applies to HTTP/1.0 and HTTP/1.1 only.
    • The browser is briefly allocating space in the disk cache.

You can see the difference between running concurrently or in series by looking at the waterfall in your browser network tab.

const endpoints = [
  'https://swapi.dev/api/people/1/',
  'https://swapi.dev/api/people/2/',
  'https://swapi.dev/api/people/3/'
]
function requestConcurrently() {
  // Fire each request without awaiting it, so all three are dispatched at once.
  endpoints.forEach(endpoint => fetch(endpoint))
}

requestConcurrently()

Waterfall for concurrent requests:

[Screenshot: Network waterfall with the three requests overlapping]

const endpoints = [
  'https://swapi.dev/api/people/1/',
  'https://swapi.dev/api/people/1/',
  'https://swapi.dev/api/people/1/'
]
function requestConcurrently() {
  endpoints.forEach(endpoint => fetch(endpoint))
}

requestConcurrently()

(Using the same URL): [screenshot of the Network waterfall]

Now an in-series example:

const endpoints = [
  'https://swapi.dev/api/people/1/',
  'https://swapi.dev/api/people/2/',
  'https://swapi.dev/api/people/3/'
]
async function requestSeries() {
  // Await each fetch before starting the next, so requests run one at a time.
  for (const endpoint of endpoints) {
    await fetch(endpoint)
  }
}

requestSeries()

Waterfall for in-series requests: [screenshot of the Network waterfall]
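If you also need the response bodies, a `Promise.all` variant (a sketch reusing the same endpoints; error handling omitted) still dispatches all requests concurrently and then waits for every result:

```javascript
const endpoints = [
  'https://swapi.dev/api/people/1/',
  'https://swapi.dev/api/people/2/',
  'https://swapi.dev/api/people/3/'
]

// All fetches are dispatched immediately by map(); Promise.all only
// waits for them, so the requests still run concurrently.
async function requestConcurrentlyAndCollect() {
  const responses = await Promise.all(endpoints.map(endpoint => fetch(endpoint)))
  return Promise.all(responses.map(response => response.json()))
}

// Usage: requestConcurrentlyAndCollect().then(people => console.log(people))
```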

morganney
  • 6,566
  • 1
  • 24
  • 35
  • I don't think that is the reason. It is related to HTTP/1.1 for sure, because your requestSeries works only with HTTP/2 servers; trying that with an HTTP/1 server will just do as I described. However, I can't figure out why, if there is a connection limit > 1, it would behave like that. I believe Chrome has a limit of 2, and Firefox of 6. – useless Mar 04 '23 at 18:53
  • I think you have a misunderstanding or misconfigured server. – morganney Mar 05 '23 at 20:13
  • it wasn't that, Bergi nailed the reason why it was behaving like that. there was no error in the javascript nor the server. – useless Mar 05 '23 at 20:24
  • It behaves the same whether you request the same URL or not. Also, did you not see the bullet item **The browser is briefly allocating space in the disk cache.** – morganney Mar 05 '23 at 22:16
  • It behaves the same with the same URL? No, it doesn't: https://imgur.com/OmSM8K0 Not, at least, on latest Chrome 110.0.5481.177. Also, waiting for the response of a request to see if it CAN be cached is quite different from allocating space to cache. – useless Mar 10 '23 at 18:22
  • @useless your requests are firing concurrently, you can see that in the waterfall. Have you tried running the series code? I also attached a new image of running them concurrently to the same URL. You also have not made any real distinction between the reasons for the stalling phase of some of the concurrent connections. How do you know **waiting for a response of a request to see if it CAN be cached is quite different to allocating space to cache**? There is no way you can. – morganney Mar 11 '23 at 00:45
-1

@Bergi's comment above gives the reason why Chrome was behaving like that:

The browser is queuing them in the hope that the response can contain a cache header so it doesn't have to repeat the request
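A minimal sketch of the workaround (the helper name and the `?r=N` parameter values are my own; the endpoint is the one from the question): give each request a unique query string, so Chrome no longer holds the duplicates back while waiting to see whether the first response is cacheable.

```javascript
// Build distinct URLs by appending a cache-busting query value.
function cacheBustedUrls(base, count) {
  return Array.from({ length: count }, (_, i) => `${base}?r=${i + 1}`)
}

// Each URL is now unique, so the browser dispatches them in parallel:
// cacheBustedUrls('http://localhost/test-paralell-loading.php', 3)
//   .forEach(url => fetch(url))
```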

useless
  • 1,876
  • 17
  • 18