
I'm implementing a web server in Node.js that must serve a lot of concurrent requests. Since Node.js processes the requests one by one, it keeps the pending ones in an internal queue (in libuv, I guess).

I also want to run my web server using the cluster module, so there will be one request queue per worker.

Questions:

  1. If any worker dies, how can I retrieve its queued requests?
  2. How can I put retrieved requests into other workers' queues?
  3. Is there any API to access a live worker's request queue?

By No. 3 I mean that I want to keep queued requests somewhere such as Redis (if possible), so that in case of a server crash, failure, or even a hardware restart I can retrieve them.
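
For context, here is a minimal sketch (not from the question) of the cluster setup being described: each worker runs its own HTTP server and therefore its own internal queue, and a replacement worker starts with an empty queue, which is why the pending requests would need to live somewhere external such as Redis:

```js
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  // Master: fork one worker per CPU core.
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
  cluster.on('exit', (worker) => {
    // The dead worker's queued/in-flight requests are gone;
    // the replacement starts with an empty queue.
    console.log(`worker ${worker.process.pid} died, forking a replacement`);
    cluster.fork();
  });
} else {
  // Worker: each worker has its own server and its own request queue.
  http.createServer((req, res) => {
    res.end(`handled by worker ${process.pid}\n`);
  }).listen(3000);
}
```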

Majid Yaghouti

1 Answer


Since you mention Redis in the tags (you are already using it or want to use it), you can use a Redis-based queue manager to do all of this work for you.

Check out https://github.com/OptimalBits/bull (or its alternatives).

bull is built around the concept of a queue: you add jobs to a queue and listen to the same queue from different processes/VMs. bull will deliver each job to only one listener, and you can control how many jobs each listener processes at the same time (the concurrency level).
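
A minimal sketch of that pattern, assuming a local Redis on the default port; the queue name "requests", the payload, and the file split are made up for illustration:

```js
// producer.js — runs in the web-facing process: instead of doing the work
// inline, push each request's payload into a Redis-backed queue so it
// survives a worker crash.
const Queue = require('bull');

const requestQueue = new Queue('requests', 'redis://127.0.0.1:6379');

async function enqueue(payload) {
  await requestQueue.add(payload);
}
```

```js
// worker.js — can run in any number of processes/VMs; bull delivers each job
// to exactly one listener. The first argument to process() is the concurrency
// level: how many jobs this listener handles at the same time.
const Queue = require('bull');

const requestQueue = new Queue('requests', 'redis://127.0.0.1:6379');

requestQueue.process(5, async (job) => {
  // job.data is whatever the producer passed to add()
  console.log('processing', job.data);
});
```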

In addition, if one of the jobs fails to run (in other words, the listener of the queue threw an error), bull will try to give the same job to a different listener.
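
For completeness, bull also lets you request automatic retries when adding the job; a small sketch, using the same hypothetical requestQueue and payload from above:

```js
// Retry a failed job up to 3 times, waiting 5 seconds between attempts.
// If the processor throws, the job goes back to the queue and may be
// picked up by a different listener.
requestQueue.add(payload, { attempts: 3, backoff: 5000 });
```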

Stav Alfi