
We have a node.js server which implements a REST API as a proxy to a central server which has a slightly different, and unfortunately asymmetric REST API.

Our client, which runs in various browsers, asks the node server to get the tasks from the central server. The node server gets a list of all the task ids from the central one and returns them to the client. The client then makes two REST API calls per id through the proxy.
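As a rough sketch, the request pattern described above looks something like this on the client (hypothetical code: the function name and the fetch-style API are my assumptions; only the URL shapes come from our logs):

```javascript
// Hypothetical sketch of the client's request pattern: one call for the
// id list, then two calls per id, all going through the node proxy.
async function loadTasks(baseUrl) {
  const ids = await (await fetch(`${baseUrl}/api/v1/tasks/`)).json();
  // All pairs are fired off concurrently; nothing here limits them to 6.
  return Promise.all(ids.map(async id => {
    const task = await (await fetch(`${baseUrl}/api/v1/tasks/id/${id}`)).json();
    const workflow = await (await fetch(`${baseUrl}/api/v1/workflow/id/${id}`)).json();
    return { task, workflow };
  }));
}
```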

As far as I can tell, this stuff is all done asynchronously. In the console log, it looks like this when I start the client:

Requested GET URL under /api/v1/tasks/*: /api/v1/tasks/

This takes a couple seconds to get the list from the central server. As soon as it gets the response, the server barfs this out very quickly:

Requested GET URL under /api/v1/tasks/id/:id :/api/v1/tasks/id/438
Requested GET URL under /api/v1/workflow/id/:id :/api/v1/workflow/id/438
Requested GET URL under /api/v1/tasks/id/:id :/api/v1/tasks/id/439
Requested GET URL under /api/v1/workflow/id/:id :/api/v1/workflow/id/439
Requested GET URL under /api/v1/tasks/id/:id :/api/v1/tasks/id/441
Requested GET URL under /api/v1/workflow/id/:id :/api/v1/workflow/id/441

Then, each time a pair of these requests gets a result from the central server, another two lines are barfed out very quickly.

So it seems our node.js server is only willing to have six requests out at a time.

Almo
  • Node.js has no such thing as a request limit. This could come from buffering output or many other things. – 3on Aug 21 '12 at 18:31
  • It would help to see your Node server's code -- otherwise, it's hard to do anything more than guess what your problem is. – josh3736 Aug 21 '12 at 18:34
  • It's 1,200 lines, so I'm not sure what the relevant parts to post would be. I was hoping it might be some obvious noob mistake since we're new to working with node.js – Almo Aug 21 '12 at 18:36
  • 1
    @Almo, Remove chunks of code until you narrow it down. – Brad Aug 21 '12 at 18:46
  • Isn't this due to limits on the browser? (Ie. mapservers have abc subdomains, due to limited concurrent requests allowed from browsers?) – knutole Oct 24 '14 at 18:37
  • @knutole look at the accepted answer. :) – Almo Oct 24 '14 at 18:39
  • It is not a NodeJS limit; it is the browser limiting itself to a max of 6 concurrent network calls to the same host. The per-browser limits are listed here: http://sgdev-blog.blogspot.it/2014/01/maximum-concurrent-connection-to-same.html – ivan Jun 01 '17 at 16:26

6 Answers


There are no TCP connection limits imposed by Node itself. (The whole point is that it's highly concurrent and can handle thousands of simultaneous connections.) Your OS may limit TCP connections.

It's more likely that you're hitting some kind of limit on your backend server, or the built-in HTTP library's connection limit, but it's hard to say without more details about that server or your Node implementation.

Node's built-in HTTP library (and obviously any libraries built on top of it, which are most) maintains a connection pool (via the Agent class) so that it can utilize HTTP keep-alives. This helps increase performance when you're running many requests to the same server: rather than opening a TCP connection, making a HTTP request, getting a response, closing the TCP connection, and repeating; new requests can be issued on reused TCP connections.

In node 0.10 and earlier, the HTTP Agent will only open 5 simultaneous connections to a single host by default. You can change this easily (assuming you've required the HTTP module as http):

http.globalAgent.maxSockets = 20; // or whatever

node 0.12 sets the default maxSockets to Infinity.

You may want to keep some kind of connection limit in place. You don't want to completely overwhelm your backend server with hundreds of HTTP requests in under a second; performance would most likely be worse than if you just let the Agent's connection pool do its thing, throttling requests so as not to overload your server. Your best bet is to run some experiments to find the optimal number of concurrent requests in your situation.

However, if you really don't want connection pooling, you can bypass the pool entirely by setting agent to false in the request options:

http.get({host:'localhost', port:80, path:'/', agent:false}, callback);

In this case, there will be absolutely no limit on concurrent HTTP requests.

josh3736
  • This is true for incoming connections, but not outgoing. I suspect he's hitting the max number of sockets open on a given `http.Agent` to a given host. – Michelle Tilley Aug 21 '12 at 20:56
  • @BrandonTilley: It is true that there are no limits for outgoing *TCP connections*, which is what I was thinking of when I first wrote this. You're right that the default HTTP library limits simultaneous connections per-host; I've updated the answer. – josh3736 Aug 21 '12 at 21:34
  • Apologies; I misunderstood the server structure. Question edited. – Almo Aug 22 '12 at 12:23
  • Yeah I just got bit by the problem, local machine is running Node 0.12.2 while the server is running 0.10.38. – Michael Shopsin Jul 09 '15 at 15:44

It's the limit on number of concurrent connections in the browser:

How many concurrent AJAX (XmlHttpRequest) requests are allowed in popular browsers?

I have upvoted the other answers, as they helped me diagnose the problem. The clue was that node's socket limit was 5, yet I was seeing 6 requests at a time; 6 is the per-host connection limit in Chrome, which is what I was using to test the server.

Almo

How are you getting data from the central server? "Node does not limit connections" is not entirely accurate when making HTTP requests with the http module. Client requests made in this way use the http.globalAgent instance of http.Agent, and each http.Agent has a setting called maxSockets which determines how many sockets the agent can have open to any given host; this defaults to 5.

So, if you're using http.request or http.get (or a library that relies on those methods) to get data from your central server, you might try changing the value of http.globalAgent.maxSockets (or modify that setting on whatever instance of http.Agent you're using).


Michelle Tilley
  • This looks right to me. But changing the maxSockets in https.globalAgent isn't changing the behavior. I've verified that this is the agent doing the work, as its `sockets` array shows the pending requests. I'll keep looking through the code; but I'm pretty sure now that this is the issue. – Almo Aug 21 '12 at 21:31

Node.js can handle thousands of incoming requests, yes!

But outgoing requests are different: each one may need a DNS lookup, and DNS lookups (like disk reads and a few other operations) are handled by libuv, which is written in C++. By default, each Node process gets a libuv thread pool of 4 threads.

If all 4 threads are busy with DNS lookups, further outgoing requests are queued. That is why, no matter how brilliant your code is, you may sometimes see only 6 (or fewer) concurrent outgoing requests complete at a time.

Look into DNS caching to reduce the number of lookups, and consider increasing the libuv thread pool size. If you use PM2 to manage your Node processes, their documentation covers environment variables and how to inject them. The variable you are looking for is UV_THREADPOOL_SIZE, whose default is 4.

You can set it anywhere between 1 and the limit of 1024, but keep in mind that libuv's limit of 1024 applies across all event loops.
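For instance (64 is an arbitrary value; the variable has to be exported before the process starts, because libuv sizes its pool once at startup and it cannot be changed afterwards):

```shell
# Start node with a larger libuv thread pool (value is illustrative).
export UV_THREADPOOL_SIZE=64
node -e 'console.log("pool size requested:", process.env.UV_THREADPOOL_SIZE)'
```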

Dharman
NodejsToGo

I have seen the same problem in my server: it was only processing 4 requests. As explained already, from 0.12 maxSockets defaults to Infinity, which can easily overwhelm the server. Limiting the requests to, say, 20 with

http.globalAgent.maxSockets = 20;

solved my problem.


Are you sure it just returns the results to the client? Node processes everything in one thread, so if you do some fancy response parsing, or anything else which doesn't yield, it will block all your requests.
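A minimal illustration of that single-threaded blocking (the 200 ms busy-wait stands in for any heavy synchronous parsing):

```javascript
// Node runs all JavaScript on one thread: a long synchronous task delays
// every queued callback, including responses to other requests.
setTimeout(() => console.log('timer fired'), 0);

const start = Date.now();
while (Date.now() - start < 200) {} // 200 ms of synchronous "parsing"

console.log('blocked for about', Date.now() - start, 'ms');
// The 0 ms timer only fires after the busy loop releases the thread.
```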

disjunction
  • There is a little response parsing, but the delay between the first six and the later responses is a few seconds. I'm sure we're not doing THAT much response processing. – Almo Aug 21 '12 at 18:59
  • Thanks, it was my case. – mimic Jun 28 '17 at 21:05