
I have an Angular application which needs to send N XHR requests, where 1 <= N <= 10000.

The application needs to handle them as fast as possible, so preferably there should be multiple active XHR requests at the same time, in a sliding window of concurrent requests. Using WebSockets or another streaming-like solution is not possible due to server-side API limitations.

My first idea was to use something like RxJS forkJoin, but I struggle to limit the number of concurrent requests. As far as I know, there are browser limitations on the max number of requests; for instance, Chrome will allow only 8 simultaneous requests.

Most of the solutions/tutorials I found either a) do not limit the maximum number of concurrent connections, or b) do not update dynamically (timeout-based solutions are not efficient for this task).

For instance:

import { from } from 'rxjs';
import { concatMap, delay, scan, switchMap, tap } from 'rxjs/operators';

// `request` is a helper that returns an Observable of the parsed response
const test = () =>
  request(`https://swapi.co/api/people/1/`)
    .pipe(
      delay(1000),
      switchMap(response => from(response.films)),
      concatMap((url: string) => request(url).pipe(delay(1000))),
      scan((acc, res) => [...acc, res.title], []),
      tap(console.log)
    )
    .subscribe();

is not good for me, as the limitation is achieved by delay. I would like to achieve something like a thread-pool-based solution: there is a maximum of Y concurrent connections, and if one finishes, a new request starts immediately.

const test = () =>
  request(`https://swapi.co/api/people/1/`)
    .pipe(
      switchMap(response => from(response.films)),
      specialOperatorIAmLookingFor((url: string) => request(url), 8),   // where '8' is the maximum number of parallel requests
      scan((acc, res) => [...acc, res.title], []),
      tap(console.log)
    )
    .subscribe();

Any ideas how to solve this nicely? It feels like RxJS should already have a ready-made solution for this.

ForestG
  • And what happens when you just straightforwardly try to send the 1000 requests simultaneously? Doesn't the browser manage them under the hood? – mbojko Jul 13 '20 at 08:08
  • Oh, interesting! So you suggest that this browser limitation means that the program will be blocked until all requests are sent, instead of failing after reaching the limit? In that case, the browser already manages what I want to achieve? – ForestG Jul 13 '20 at 08:24
  • Not exactly "blocked", from the script's point of view they are just your regular async operations. They all "go out" at once, even if actually only, say, 8 are alive at the same time. – mbojko Jul 13 '20 at 08:35
  • Thank you for the clarification. This means that I don't really need to handle this at all. I would accept this comment if it were an answer :) – ForestG Jul 13 '20 at 08:38
  • Hey, you don't need to handle anything extra, but instead of using `switchMap` you should use `mergeMap` to make the requests parallel, if you don't want to cancel the previous request. – micronyks Jul 13 '20 at 08:55
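
A minimal sketch of what that last comment suggests, with the concurrency cap the question asks for. It assumes, as in the question, that request() is a helper returning an Observable of the parsed response; the second argument of mergeMap is the standard RxJS concurrency limit:

import { from, Observable } from 'rxjs';
import { mergeMap, scan, switchMap, tap } from 'rxjs/operators';

// assumed helper, as in the question
declare function request(url: string): Observable<any>;

const test = () =>
  request(`https://swapi.co/api/people/1/`)
    .pipe(
      switchMap(response => from(response.films as string[])),
      // mergeMap's second argument caps concurrency: at most 8 inner
      // requests run at once, and as soon as one completes the next
      // queued URL is subscribed immediately, with no artificial delay
      mergeMap((url: string) => request(url), 8),
      scan((acc: string[], res) => [...acc, res.title], []),
      tap(console.log)
    )
    .subscribe();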

1 Answer


You could try using the RxJS bufferCount and concatMap operators along with forkJoin().

From bufferCount docs:

Collect emitted values until provided number is fulfilled, emit as array.

So it collects n notifications and emits them as an array. We could then pass the array through forkJoin() to fire n requests in parallel.
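
For instance, a small sketch with plain numbers (my own illustration, not from the docs) showing how bufferCount(2) batches a stream of five values and flushes the remainder on completion:

import { from } from 'rxjs';
import { bufferCount } from 'rxjs/operators';

from([1, 2, 3, 4, 5])
  .pipe(bufferCount(2))
  .subscribe(console.log);
// logs [1, 2], then [3, 4], then [5] when the source completes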

Try the following

I assume this.urls is a collection of cold (not-yet-subscribed) HTTP request Observables similar to

urls = [
  this.http.get('url1'),
  this.http.get('url2'),
  this.http.get('url3'),
  ...
];
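
For N up to 10000 you would presumably build that array programmatically; a one-line sketch (the ids field and endpoint URL are made up for illustration):

urls = this.ids.map(id => this.http.get(`https://example.com/api/items/${id}`));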

Then the request-triggering code would look like

bufferedRequests() {
  from(this.urls).pipe(
    bufferCount(6),                          // <-- adjust number of parallel requests here
    concatMap(buffer => forkJoin(buffer))    // wait for the whole batch before subscribing to the next
  ).subscribe(
    res => console.log(res),
    err => console.log(err),
    () => console.log('complete')
  );
}
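
One caveat with this sketch: forkJoin errors as soon as any request in a batch errors, which tears down the whole stream. If partial failures are acceptable, you could map a failed request to a null placeholder with catchError (the null fallback is my assumption, not part of the original answer; it needs `of` from 'rxjs' and `catchError` from 'rxjs/operators'):

bufferedRequests() {
  from(this.urls).pipe(
    bufferCount(6),
    concatMap(buffer =>
      forkJoin(buffer.map(req$ =>
        // a failed request becomes null instead of erroring the stream
        req$.pipe(catchError(() => of(null)))
      ))
    )
  ).subscribe(res => console.log(res));
}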

According to this comment by a Chromium engineer, the actual solution to the max-connections-per-host/domain limit would be to use WebSockets or domain sharding. But since you mention that isn't possible in your environment, you could use the buffered-request workaround.

However, I wouldn't buffer up to the max limit. If you send more requests to the same domain than the max allowed, the additional requests queue up behind until the earlier ones have finished. So if you were to buffer up to the max allowed limit, and your application sent an additional request to the same domain from somewhere else that the app workflow depends on, the entire app could stall.

So it's better to use either WebSockets or domain sharding. And if neither is possible, it's better to buffer the requests to a number less than* the max allowed limit.

* Obviously, if you're 100% sure no other requests will be triggered to the same domain while the buffering procedure runs, then you could buffer up to the max allowed limit.

ruth
  • Would this approach also work where you don't want to overload the browser's request queue, so that API calls (unrelated to the buffered calls) can be made while the batch of 1,000 is in progress? – Drenai Jun 07 '22 at 22:25
  • @Drenai: AFAIK, there is a hard limit on the number of simultaneous requests from a browser, regardless of the domain. I could only find [this post](https://stackoverflow.com/questions/985431/max-parallel-http-connections-in-a-browser), which may well be outdated. So even when the buffered queuing mechanism shown above is used, you should be mindful of the browser's hard limit on max parallel requests. – ruth Jun 08 '22 at 10:07