
I am running into performance problems with an Angular application that queries a backend through API endpoints.

The backend exposes an endpoint that I need to call a huge number of times (>2000), as fast as possible.

Thus, I iterate over a Set and, for each item, call a service that invokes HttpClient's get method. It looks approximately like this:

this.itemList.forEach((item: Item) => {
    this.itemWsService
      .getItemComputation(item)
      .subscribe(/* callback method */);
});

The problem is, I am not getting the performance I was hoping for.

In order to understand what was slowing the application down, I measured:

  • the time at which each HttpClient get call was executed
  • the time at which each HTTP GET request was received on the backend (using logback-access)

The result (shown in the chart below) is that requests arrive at the backend many seconds after I execute the method in Angular.

[Chart: frontend call times vs. backend reception times]

My question is: Is Angular somehow waiting before firing the get calls? Might there be some bottleneck preventing HTTP requests from being made in parallel? How can I avoid this?


I am running the Angular frontend and the Java + Spring backend on the same machine, and the Angular app is embedded into a native app using Electron.

EnzoMolion

2 Answers


Is Angular kind of waiting before firing the execution of the get method call?

No.

Might there be some bottleneck preventing HTTP requests from being made in parallel?

Yes:

  • browsers don't send more than a few concurrent requests to a given host (around 6, IIRC). See Max parallel http connections in a browser?
  • if you're using Spring MVC, its request-handling thread pool has a maximum size, so requests can queue on the server side, too.

How to avoid this?

You can probably tweak your browser settings to increase the limit, but the correct fix is to redesign the API so you don't have to send 2000 concurrent requests in the first place.
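Short of redesigning the API, one way to stay under the browser's per-host connection limit is to throttle concurrency on the client. The sketch below is not from the original answer: `runWithConcurrencyLimit` and its `task` callback are hypothetical names, and the callback stands in for something like `itemWsService.getItemComputation` converted to a Promise.

```typescript
// Sketch: run at most `limit` tasks at a time over a list of items.
// `task` is a placeholder for the real per-item request.
async function runWithConcurrencyLimit<T, R>(
  items: T[],
  limit: number,
  task: (item: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;

  // Each worker repeatedly pulls the next unprocessed item until
  // the list is drained, so at most `limit` tasks are in flight.
  async function worker(): Promise<void> {
    while (next < items.length) {
      const index = next++;
      results[index] = await task(items[index]);
    }
  }

  const workerCount = Math.min(limit, items.length);
  await Promise.all(Array.from({ length: workerCount }, () => worker()));
  return results;
}
```

This keeps the request queue in application code rather than in the browser's connection pool, so timing measurements reflect actual network latency instead of queuing delay.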

JB Nizet
  • I marked this answer as accepted: the browser limitation seems to be the actual explanation for this behavior (the first 6 requests are nearly simultaneous, as shown on the graph). I agree that this is a bad design; it was designed this way a long time ago, in a rush. I'll most certainly redesign the endpoint to receive a list of items, process it, and send the results asynchronously using WebSockets. Thanks for your answer! – EnzoMolion May 02 '19 at 13:26

The browser imposes a limit on the number of concurrent requests.

From an application design point of view, if you need to create thousands of requests at once, something is wrong. For example, you could send a single request for all the items and receive all the data in a single response.
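The batching idea above can be sketched on the client side. This is a hypothetical illustration, not the asker's actual API: `chunk` splits the item list into batches, and the (not shown) backend endpoint would accept a list of items and return a list of results.

```typescript
// Sketch: split the item list into batches so that, e.g., 2000 items
// become a handful of requests instead of 2000 individual ones.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// With 2000 items and a batch size of 500, only 4 HTTP requests are
// needed, which fits comfortably under the browser's per-host limit.
```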

I don't know much about Spring, but I expect it provides a way to return all the data at once.

kvetis
  • 1
    I already started to implement such a solution, but it needs to send computation results as soon as they are available. As I said to JB Nizet: "I'll most certainly redesign the endpoint to receive a list of items, process it and send the result asynchronously using WebSockets." Thanks for your answer too. – EnzoMolion May 02 '19 at 13:28