
I'm quite new to Node.js and I'm doing some experiments. What I gather from them (and I hope I'm wrong!) is that Node.js cannot serve many concurrent requests for the same resource without putting them in sequence.

Consider the following code (I use the Express framework in this example):

var express = require('express');
var app = express();

app.get('/otherURL', function (req, res) {
    res.send('otherURL!');
});

app.get('/slowfasturl', function (req, res) {
    // Randomly pick the slow or the fast path.
    var test = Math.round(Math.random());

    if (test === 0) {
        // Slow path: answer after 10 seconds.
        setTimeout(function () {
            res.send('slow!');
        }, 10000);
    } else {
        // Fast path: answer immediately.
        res.send('fast!');
    }
});

app.listen(3000, function () {
    console.log('app listening on port 3000!');
});

The code above exposes two endpoints. A request to /slowfasturl falls into one of two scenarios:

scenario 1 : reply immediately with the plain text "fast!"

or

scenario 2 : reply after 10 seconds with the plain text "slow!"

My test: I opened several Chrome windows and called the slowfasturl URL from all of them at the same time. I noticed that the first request that falls into "scenario 2" blocks all the requests fired subsequently (from the other Chrome windows), independently of whether those fall into "scenario 1" (and so would return "fast!") or "scenario 2" (and so would return "slow!"). The requests stay blocked at least until the first one (the one in "scenario 2") has completed.

How do you explain this behavior? Are all the requests made to the same resource served in sequence?

I observe a different behavior if, while the request that fell into "scenario 2" is waiting for its response, a second request is made to a different resource (e.g. the otherURL endpoint shown above). In that case the second request completes immediately, without waiting for the first one.

thank you

Davide

3 Answers


As far as I remember, the requests are being blocked on the browser side.

Your browser is serializing those parallel requests to the same URL, but your server can process them concurrently. Try different browsers, or curl, and it should work.

AsTeR
  • Dear @AsTeR, I did a further check: you are right, if for example I use different browsers (e.g. one request with Chrome and another with IE) the requests seem to be managed in parallel. But if I replace the setTimeout with a loop checking Date.now() (which executes synchronously), the requests are managed in sequence! Can we say the following: the requests are managed in sequence by the nodejs listener, until the processing is delegated to a callback function (as is the case with setTimeout)? – Davide Scicolone Apr 26 '16 at 10:08
  • setTimeout is adding an event to process in the future, when you set a loop, your thread is blocked on that loop. You should never block anything in JavaScript or it will freeze like you are noticing. I have noticed that you are new here: "welcome!", if my answer is valid you can check the green tick next to it. – AsTeR Apr 26 '16 at 11:56
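The point made in these comments can be seen without any HTTP at all. The sketch below (plain Node.js, no external modules; blockFor is a made-up helper, not a Node API) schedules a 10 ms timer and then busy-waits for 200 ms: the timer cannot fire until the synchronous loop returns.

```javascript
// Contrast between a synchronous busy-wait (which blocks the event loop)
// and setTimeout (which schedules work and yields back to it).
function blockFor(ms) {
    // Synchronous spin: nothing else can run on the event loop meanwhile.
    var end = Date.now() + ms;
    while (Date.now() < end) { /* busy */ }
}

var scheduledAt = Date.now();
setTimeout(function () {
    // Asked for 10 ms, but the busy-wait below delays this past 200 ms.
    console.log('timer fired after ~' + (Date.now() - scheduledAt) + ' ms');
}, 10);

blockFor(200); // the timer callback cannot run until this returns
```

Swap the order of the two mechanisms in the /slowfasturl handler and you reproduce both behaviors from the question: setTimeout leaves the loop free, the busy-wait freezes it.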
  1. The behavior you observe can only be explained by some sequencing that the browser does. Node does not service requests in sequence; instead it works on an event-driven model, leveraging the libuv library.

  2. I have run your test case with a non-browser client, and confirmed that the requests do not influence each other.

To gain further evidence, I suggest the following:

  1. Isolate the problem scope. Remove express (an http abstraction) and use either the http module (the base http implementation) or even the net (TCP) module.
  2. Use a non-browser client. I suggest ab (if you are on Linux), the Apache benchmarking tool, built specifically for web server performance measurement. I used

ab -t 60 -c 100 http://127.0.0.1:3000/slowfasturl

     which collects data for 60 seconds from 100 concurrent clients.

  3. Make it more deterministic by replacing Math.random with a counter, toggling between a huge timeout and a small timeout.
  4. Check the result to see the rate and pattern of slow and fast responses.

Hope this helps.

Gireesh Punathil
  • I mark this as the correct one since it is the most complete – Davide Scicolone Apr 26 '16 at 07:29
  • Dear @Gireesh Punathil, I did a further check: you are right, if for example I use different browsers (e.g. one request with Chrome and another with IE) the requests seem to be managed in parallel. But if I replace the setTimeout with a loop checking Date.now() (which executes synchronously), the requests are managed in sequence! Can we say the following: the requests are managed in sequence by the nodejs listener, until the processing is delegated to a callback function (as is the case with setTimeout)? – Davide Scicolone Apr 26 '16 at 10:03
  • Davide, nice to see your experiment-based inference gathering model, which I also follow quite often. Your inference is right; a more comprehensive interpretation of this behavior is: requests are managed in sequence as long as they are executable on the machine without any external (peripheral) dependency, until one encounters an operation that involves such a dependency, and thereby latency. A detailed sequence of events describing the execution chronology in Node can be found here: http://stackoverflow.com/questions/10680601/nodejs-event-loop – Gireesh Punathil Apr 26 '16 at 12:48
  • Thank you for your response. Can we say that the best way is then to delegate to another process (our callback function) as soon as possible? Is this a good approach? – Davide Scicolone Apr 26 '16 at 18:49

Davide: this needs some elaboration, so I am adding it as another answer rather than a comment, which has space constraints.

If you are hinting at the current node model as a problem:

Traditional languages (and runtimes) ran code strictly in sequence. Threads were used to scale this, but they bring side effects such as: i) access to shared data needs sequencing, and ii) I/O operations block. Node is the result of a careful integration of three entities, libuv (the multiplexer), v8 (the executor), and node (the orchestrator), to address those issues. This model provides improved performance and scalability under web and cloud deployments. So there is no problem with this approach.

If you are hinting at further improvements for managing stressful CPU-bound operations in node, where there will inevitably be waiting periods: yes, leveraging multiple cores and introducing more threads to share the CPU-intensive workload would be the right way.

Hope this helps.

Gireesh Punathil