I am new to NodeJS and would like to know the best pattern to use to achieve the following.
An incoming REST API call requests a batch operation (say 100 items) to be performed.
The Express handler then has to make 100 external async API calls, each of which can take up to 120 seconds to return; the batch as a whole can take many minutes to complete.
The handler just returns an ACK and then queues up the requests for processing (say, in a database or in memory).
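To make that concrete, here is roughly what I have in mind for the handler. The in-memory `jobQueue` is just a stand-in for a real database-backed queue, and the route and field names are placeholders:

```js
const express = require('express');
const app = express();
app.use(express.json());

// Naive in-memory queue just to illustrate; in practice this would be
// backed by a database or a proper job queue.
const jobQueue = [];

app.post('/batch', (req, res) => {
  const items = req.body.items || [];      // e.g. 100 items
  const batchId = Date.now().toString();   // placeholder batch id

  // Persist batch state (DB write would go here), then queue each item.
  for (const item of items) {
    jobQueue.push({ batchId, item, attempts: 0 });
  }

  // ACK immediately; the actual processing happens later.
  res.status(202).json({ batchId, queued: items.length });
});

app.listen(3000);
```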
I then need to process each request by making the external API calls in parallel:
(A) Each API call will return the number of "free resources" available (after the current API call is serviced) for additional parallel calls.
(B) These API calls need to be made in parallel, but a call should not be attempted if the number of "free resources" on the backend is less than a defined threshold.
The idea is simple: the application should not make too many parallel API calls, and should try to leave at least, say, five backend resources free.
However, as the number of free resources is only known after the first API call returns, it's OK to go over the limit initially and then scale back later.
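In other words, the throttling rule would look something like this (`THRESHOLD`, `freeResources` and the helper names are my own placeholders):

```js
const THRESHOLD = 5;      // leave at least this many backend resources free

// Unknown until the first response arrives, so start optimistic:
// dispatching is allowed even before we have a real number.
let freeResources = Infinity;

function canDispatch() {
  return freeResources > THRESHOLD;
}

// Called with the "free resources" value from each API response,
// so the estimate tightens (or relaxes) as calls complete.
function recordFreeResources(reported) {
  freeResources = reported;
}
```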
In Java I would do this as follows:
For the incoming batch, populate a queue with 100 request objects (after storing their state in the DB).
Have a fixed number of worker threads servicing the queue.
Each worker thread checks a global object that maintains the count of free resources.
If the free resources are greater than the threshold, make the API call and update the global counter. (Also update the DB to indicate the API call was made for that batch item; the item would be re-submitted to the queue if the API call failed for any reason.)
I am not too concerned with optimising the number of threads running or initial spikes etc., as long as there is a decent attempt not to flood the backend with too many requests.
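My rough attempt at the Node equivalent of the above, reusing `jobQueue` and the counter helpers from the earlier sketches; `callBackendApi` and `markItemDone` are placeholders for the real external call and the DB update:

```js
const WORKER_COUNT = 10;   // fixed "pool" size, like the Java worker threads
const POLL_MS = 500;       // back-off when the queue is empty or the backend is busy

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Each "worker" is just an async loop; WORKER_COUNT of them run concurrently
// on Node's single thread, giving bounded parallelism for the outbound calls.
async function worker() {
  while (true) {
    if (jobQueue.length === 0 || !canDispatch()) {
      await sleep(POLL_MS);             // nothing to do, or too few free resources
      continue;
    }
    const job = jobQueue.shift();
    try {
      // Placeholder: the real external call, assumed to resolve with
      // { freeResources: <number> } once it has been serviced.
      const result = await callBackendApi(job.item);
      recordFreeResources(result.freeResources);
      // Placeholder: DB update marking this batch item as done.
      await markItemDone(job.batchId, job.item);
    } catch (err) {
      job.attempts += 1;
      jobQueue.push(job);               // re-submit to the queue on failure
    }
  }
}

for (let i = 0; i < WORKER_COUNT; i++) {
  worker();
}
```

This seems to give bounded concurrency without real threads, but it feels like I am re-implementing a job queue and worker pool by hand, which brings me to my question.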
Any suggestions for an async / thread pool framework I can use?