I'm trying to understand the limits of Node.js when serving a lot of content under load. Specifically, I want to know whether streaming a lengthy response to a client will block.
I've created a very simple test setup where Node responds to every request with a stream that's pulling from a 1GB HTTP download. Here's my code:
var http = require('http');

var iterator = 0;

http.createServer(function (req, res) {
  console.log('req received ', iterator++);

  // Proxy a large remote file to the client as a stream.
  var url = 'http://download.thinkbroadband.com/1GB.zip';
  http.get(url, bigFile => {
    // bigFile is the upstream response; forward its headers, then its body.
    res.writeHead(200, {
      'content-type': 'application/zip',
      'content-length': bigFile.headers['content-length'],
    });
    bigFile.pipe(res);
  });
}).listen(8003);
So I launched this Node server and hit the endpoint with several tabs in my browser. What was interesting was that subsequent requests don't immediately trigger the console.log('req received ', iterator++);
line. Instead there's a delay of 5 to 10 seconds before each one is logged.
This is strange to me: if streaming an HTTP response were blocking, the server should wait until the first request completes before accepting the second; if it isn't blocking, I'd expect every request to be logged immediately after it's made.
Can someone explain this?
I'd also love to hear any thoughts about performance here. Node probably isn't really built for this sort of thing, and the download speed suffers badly with multiple concurrent requests.