1

I am using Node.js to create a REST API.

In my scenario I have two APIs.

API 1: has to fetch 10,000 records and iterate over them to modify some of the data.

API 2: a simple GET method.

When I open Postman and hit the first API and the second API in parallel, the second API's response is slow, because Node.js is single-threaded.

My expectation: even though the 1st API takes time, it should not make the 2nd API wait a long time.

From the Node.js docs I found the clustering concept: https://nodejs.org/dist/latest-v6.x/docs/api/cluster.html. So I implemented clustering and it created 4 server instances. When I hit API 1 in one tab and API 2 in a second tab, it worked fine. But when I opened API 1 in 4 tabs and API 2 in a 5th tab, the slowness came back.

What would be the best solution to this issue?

Rocket55
  • Please give an example where you need to iterate over 10,000 records and modify the data. I can't think of any example scenario. Are you using MongoDB? Postgres? – borislemke Dec 11 '16 at 17:30
  • Not using any database. It depends on two external services; after getting data from the external services I need to modify it. – Rocket55 Dec 11 '16 at 17:40
  • But how is it possible that you need to update 10,000+ records in a single operation? I still do not quite get a real-world example of when to do this. Please give us a real example. – borislemke Dec 11 '16 at 19:29

2 Answers

1

Because of the single threaded nature of node.js, the only way to make sure your server is always responsive to quick requests such as you describe for API2 is to make sure that you never have any long running operations in your server.

When you do encounter some operation in your code that takes a while to run and would affect the responsiveness of your server, your options are as follows:

  1. Move the long running work to a new process. Start up a new process and run the lengthy operation in that other process. This allows your server process to stay active and responsive to other requests, even while the other long running process is still crunching on its data (a minimal sketch follows this list).

  2. Start up enough clusters. Using the clustering you've investigated, start up more clusters than you expect to have simultaneous calls to your long-running process. This allows there to always be at least one clustered process available to respond. Sometimes, though, you cannot predict how many that will be, or it will be more than you can practically create.

  3. Redesign your long running process to execute its work in chunks, returning control to the system between chunks so that node.js can interleave other work with the long running work. Here's an example of processing a large array in chunks. That post was written for the browser, but the concept of not blocking the event loop for too long is the same in node.js (a node.js-flavored sketch also follows this list).

  4. Speed up the long running task. Find a way to speed up the long running job so it doesn't take so long (using caching, not returning so many results at once, finding a faster way to do it, etc.).

  5. Create N worker processes (probably one fewer worker process than the number of CPUs you have) and create a work queue for the long running tasks. Then, when a long running request comes in, insert it into the work queue. Each worker process is then free to work on items in the queue. When more than N long tasks are requested, the first ones get worked on immediately and the later ones wait in the queue until a worker process is available. But, most importantly, your main node.js process stays free and responsive for regular requests (see the last sketch after this list).

This last option is the most foolproof because it will be effective for any number of long running requests, though all of these schemes can help you.
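For option 1, a minimal sketch of handing the heavy work to a separate process might look like the following. The Express setup, the worker script name process-records.js, and the route paths are assumptions for illustration, not part of the original answer.

```js
// main.js — keep the event loop free by forking the heavy work (sketch)
const express = require('express');
const { fork } = require('child_process');

const app = express();

// API 1: delegate the 10,000-record job to a child process
app.get('/api1', (req, res) => {
  const worker = fork('./process-records.js'); // hypothetical worker script
  worker.once('message', (result) => res.json(result));
  worker.once('error', (err) => res.status(500).send(err.message));
  worker.send('start');
});

// API 2: stays fast because the main process is never blocked
app.get('/api2', (req, res) => res.json({ ok: true }));

app.listen(3000);
```

process-records.js would listen for the 'message' event, fetch and modify the records, and hand the result back with process.send(result).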
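For option 3, here is one way to chunk the iteration in node.js, assuming a hypothetical modifyRecord() transformation; the idea is simply to yield to the event loop between chunks with setImmediate().

```js
// Process a large array in chunks so other requests can be served in between
function processInChunks(records, chunkSize = 500) {
  return new Promise((resolve) => {
    const results = [];
    let index = 0;

    function nextChunk() {
      const end = Math.min(index + chunkSize, records.length);
      for (; index < end; index++) {
        results.push(modifyRecord(records[index])); // hypothetical transform
      }
      if (index < records.length) {
        setImmediate(nextChunk); // let pending I/O and other requests run
      } else {
        resolve(results);
      }
    }

    nextChunk();
  });
}
```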
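And for option 5, a rough sketch of a work queue in front of a fixed pool of worker processes. The worker script name, message shape, and pool size are assumptions; a production version would also need error handling and worker restarts.

```js
const { fork } = require('child_process');
const os = require('os');

const queue = [];    // pending jobs: { payload, resolve }
const idle = [];     // idle worker processes

// one fewer worker than the number of CPUs, as suggested above
const poolSize = Math.max(1, os.cpus().length - 1);
for (let i = 0; i < poolSize; i++) {
  idle.push(fork('./long-task-worker.js')); // hypothetical worker script
}

function dispatch() {
  while (idle.length > 0 && queue.length > 0) {
    const worker = idle.pop();
    const job = queue.shift();
    worker.once('message', (result) => {
      job.resolve(result);   // answer the original request
      idle.push(worker);     // the worker is free again
      dispatch();            // pick up the next queued job, if any
    });
    worker.send(job.payload);
  }
}

// called from the route handler for the long running API
function enqueueLongTask(payload) {
  return new Promise((resolve) => {
    queue.push({ payload, resolve });
    dispatch();
  });
}
```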

jfriend00
0

Node.js actually is not multi-threaded, so all of these requests are just being handled in the event loop of a single thread.

Each Node.js process runs in a single thread and, by default, has a memory limit of 512 MB on 32-bit systems and 1 GB on 64-bit systems.

However, you can split a single process into multiple processes or workers. This can be achieved through the cluster module. The cluster module allows you to create child processes (workers), which share (or don't share) all the server ports with the main Node process.

You can invoke the cluster API directly in your app, or you can use one of the many abstractions over the API:

https://nodejs.org/api/cluster.html
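A minimal example of invoking the cluster API directly (the port and response body are placeholders for illustration) could look like this:

```js
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  // fork one worker per CPU core; they all share port 3000
  os.cpus().forEach(() => cluster.fork());

  // replace a worker if it dies
  cluster.on('exit', (worker) => {
    console.log(`worker ${worker.process.pid} exited, forking a new one`);
    cluster.fork();
  });
} else {
  http.createServer((req, res) => {
    res.end(`handled by worker ${process.pid}\n`);
  }).listen(3000);
}
```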

Andrés Andrade