
I've created a Node.js script that scans the network for available HTTP pages, so there are a lot of connections I want to run in parallel, but it seems that some of the requests wait for previous ones to complete.

Following is the code fragment:

    var reply = { };
    reply.started = new Date().getTime();
    var req = http.request(options, function(res) {
        reply.status = res.statusCode;
        reply.rawHeaders = res.headers;
        reply.headers = JSON.stringify(res.headers);
        reply.body = '';
        res.setEncoding('utf8');
        res.on('data', function (chunk) {
            reply.body += chunk;
        });
        res.on('end', function () {
            reply.finished = new Date().getTime();
            reply.time = reply.finished - reply.started;
            callback(reply);
        });
    });
    req.on('error', function(e) {
        if(e.message == 'socket hang up') {
            return;
        }
        errCallback(e.message);
    });
    req.end();

This code performs only 10-20 requests per second, but I need 500-1k requests per second. Every queued request is made to a different HTTP server.

I've tried raising the socket limit, but it didn't help:

    http.globalAgent.maxSockets = 500;
druidvav
    It looks like you're making HTTP requests. Is it even possible to get that many requests that fast over an internet connection? I have an extremely fast connection here, but my ping to the nearest server is about 52ms, which I think means that I could make about 20 HTTP requests per second. – Robert Harvey Jun 28 '13 at 19:45
  • I am running this script on a machine that I am sure can handle this many requests. To be precise: it is a Hetzner 6S server. – druidvav Jun 28 '13 at 19:48
  • Sure, but did you read what I said? I don't think you can *make* that many requests over HTTP with a single internet connection, no matter how powerful your machine is. When you make an HTTP request, you have to wait for a response from the other end. You can certainly service more requests than that, but that's because you would be servicing requests from many browsers, each with their own internet connection. – Robert Harvey Jun 28 '13 at 19:49
  • Here is the output of a popular HTTP server testing tool: `ab -n 10000 -c 1000 http://srv2.itrack.ru/` gives Requests per second: 914.94 [#/sec] (mean); Time per request: 1092.968 [ms] (mean) – druidvav Jun 28 '13 at 19:54
  • So you're queuing up requests in node.js then? Waiting for the responses? You'd have to be, and since it takes 1 second to process each request, you'd need 914 live threads in node.js to make it work. – Robert Harvey Jun 28 '13 at 19:55
  • Possibly related: http://stackoverflow.com/q/12060869 – Robert Harvey Jun 28 '13 at 19:56
  • Yep, I'm initiating 500 requests at the same time (judging by reply.started) and receiving 10-20 results/sec. This is really slow, and reading the docs for several hours hasn't led me to any resolution. – druidvav Jun 28 '13 at 19:58
  • The data payload is very small; some queued requests wait for 30-40 seconds, but a direct request takes less than a second. – druidvav Jun 28 '13 at 20:01
  • Where in your code are you launching the requests in parallel? – slebetman Aug 17 '16 at 07:48

2 Answers


Something else must be going on with your code. Node can comfortably handle 1k+ requests per second.

I tested with the following simple code:

var http = require('http');

var results = [];
var j = 0;

// Make 1000 parallel requests:
for (var i = 0; i < 1000; i++) {
    http.request({
        host: '127.0.0.1',
        path: '/'
    }, function (res) {
        results.push(res.statusCode);
        j++;

        if (j === 1000) { // last response
            console.log(JSON.stringify(results));
        }
    }).end();
}

To test purely what Node is capable of, rather than my home broadband connection, the code requests from a local Nginx server. I also avoid calling console.log until all the requests have returned, because console.log is implemented as a synchronous function (to avoid losing debugging messages when a program crashes).

Running the code using time I get the following results:

real    0m1.093s
user    0m0.595s
sys     0m0.154s

That's 1.093 seconds for 1000 requests, which is very close to 1k requests per second.


The simple code above will generate OS errors if you try to make a lot of requests (like 10000 or more), because Node will happily try to open all those sockets in the for loop (remember: the requests don't actually start until the for loop ends; up to that point they are only being created). You mentioned that your solution runs into the same errors. To avoid this, you should limit the number of parallel requests you make.

The simplest way of limiting the number of parallel requests is to use one of the `*Limit` functions from the async.js library:

var http = require('http');
var async = require('async');

var requests = [];

// Build a large list of requests:
for (var i = 0; i < 10000; i++) {
    requests.push(function (callback) {
        http.request({
            host: '127.0.0.1',
            path: '/'
        }, function (res) {
            callback(null, res.statusCode);
        }).on('error', function (e) {
            callback(e); // report failures instead of hanging the batch
        }).end();
    });
}

// Make the requests, 100 at a time
async.parallelLimit(requests, 100,function(err, results){
    console.log(JSON.stringify(results));
});

Running this with time on my machine I get:

real    0m8.882s
user    0m4.036s
sys     0m1.569s

So that's 10k requests in around 9 seconds, or roughly 1.1k/s.

Look at the functions available from async.js.
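If you would rather not add a dependency, the same limiting pattern can be sketched by hand with a small worker pool (`runLimited` is a hypothetical helper, not part of any library; each task would wrap one http.request exactly like the entries pushed onto `requests` above):

```javascript
// Run `tasks` (functions taking a Node-style callback) with at most
// `limit` of them in flight at any one time.
function runLimited(tasks, limit, done) {
    var results = [];
    var next = 0;     // index of the next task to launch
    var active = 0;   // tasks currently in flight
    var finished = 0; // tasks that have completed

    function launch() {
        while (active < limit && next < tasks.length) {
            (function (index) {
                active++;
                tasks[index](function (err, result) {
                    results[index] = result;
                    active--;
                    finished++;
                    if (finished === tasks.length) {
                        done(results);
                    } else {
                        launch(); // refill the pool
                    }
                });
            })(next++);
        }
    }
    launch();
}
```

This is essentially what async.parallelLimit does, minus error aggregation and the other conveniences of the library.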

slebetman
  • Good approach but what does it mean 'Running with time'? – lesimoes Jul 20 '17 at 18:29
  • 2
    @lesimoes: `time` is a program available on most unix OSes including Linux and MacOS. Typically, for commands like `cd` or `ls` or `grep` or `awk` or `time` most unix users make the assumption that everyone already knows about them so no introductions are necessary. To run a program with `time` you just type `time my_program`. In this case you'd type `time node my_script.js` – slebetman Jul 21 '17 at 03:23
  • Nice! Thanks a lot! – lesimoes Jul 21 '17 at 13:30
  • OK, create an instance on DO with 1 GB memory and tell me how many requests per second you get – Eugene Jan 08 '21 at 18:18

I've found a solution for my case; it is not very good, but it works:

var childProcess = require('child_process');

I'm using curl:

childProcess.exec('curl --max-time 20 --connect-timeout 10 -iSs "' + options.url + '"', function (error, stdout, stderr) { });

This allows me to run 800-1000 curl processes simultaneously. Of course, this solution has its weaknesses, such as requiring a lot of open file descriptors, but it works.

I've tried the node-curl bindings, but they were very slow too.

druidvav