I would like to send a lot of HTTP requests in order to load test a server. Every request has to create a new connection and should wait until the response has been received before closing the connection.
Also, a new URL is generated for every request; I cannot send all the requests to the same URL.
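For reference, getRandomPath() in the snippets below is the helper that produces a fresh path for every request; its real implementation is not part of the question. A hypothetical stand-in could be as simple as:

const crypto = require('crypto')

// Hypothetical stand-in, not the question's actual helper:
// returns a unique path per call so no two requests hit the same URL.
function getRandomPath() {
    return '/' + crypto.randomBytes(8).toString('hex')
}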

At first I thought using http.request would probably be the smartest way because it's a native module. So I tried this:

const http = require('http')

// config, maxConcurrentRequests, getRandomPath() and showPerformance()
// are defined elsewhere; reqCount/failedCount track in-flight and failed requests.
for (let i = 0; i < 100; i++) {
    setInterval(() => {
        if (reqCount >= maxConcurrentRequests)
            return
        reqCount++

        const req = http.request({
            hostname: config.hostname,
            port: config.port,
            path: getRandomPath(),
            method: 'GET',
            agent: false // no pooling: every request opens its own connection
        }, () => {
            reqCount--
            showPerformance() // is used to calculate and display performance
        })
        req.on('error', (e) => {
            failedCount++
        })
        req.end()
    }, 0)
}
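The snippets here also rely on some shared state and helpers (config, reqCount, maxConcurrentRequests, failedCount, showPerformance) whose definitions are not shown. Purely as an illustration, a hypothetical setup could look like the following; the hostname, port and limit are placeholders, and the req/s calculation is just one possible approach:

// Hypothetical setup, not from the question; all values are placeholders.
const config = { hostname: 'localhost', port: 8080 }
const maxConcurrentRequests = 100

let reqCount = 0     // requests currently in flight
let failedCount = 0  // requests that ended in an error
let completedCount = 0
let lastReport = Date.now()

// One possible way to compute and print a requests-per-second figure.
function showPerformance() {
    completedCount++
    const now = Date.now()
    if (now - lastReport >= 1000) {
        console.log(`${completedCount} req/s, ${failedCount} failed, ${reqCount} in flight`)
        completedCount = 0
        lastReport = now
    }
}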

At first it sends about 600 requests per second, but after around 10 seconds the rate starts to drop and falls to 5-20 requests per second within another 10 seconds or so.

Then I decided to go a level lower and craft the HTTP requests myself, sending them directly over TCP sockets with the net module:

const net = require('net')

// config, maxConcurrentRequests, getRandomPath() and showPerformance()
// are the same helpers used above.
for (let i = 0; i < 100; i++) {
    setInterval(() => {
        if (reqCount >= maxConcurrentRequests)
            return
        reqCount++

        const socket = new net.Socket()
        socket.connect(config.port, config.hostname, () => {
            // hand-written HTTP/1.1 request; "Connection: close" tells the server
            // to close the connection after responding
            socket.write(`GET http://${config.hostname}:${config.port}/${getRandomPath()} HTTP/1.1\r\nHost: ${config.hostname}\r\nConnection: close\r\n\r\n`)
        })
        socket.on("data", data => {
            reqCount--
            showPerformance() // is used to calculate and display performance
        })
        socket.on("error", err => {
            failedCount++
        })
    }, 0)
}

But the result was pretty much the same.
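(As a side note, a one-off snippet like the one below can confirm that such a handcrafted request is well-formed; the hostname, port and path are placeholders rather than the question's config.)

const net = require('net')

// Illustration only: send a single raw request and print the response status line.
const socket = new net.Socket()
socket.connect(80, 'example.com', () => {
    socket.write('GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n')
})
socket.once('data', data => {
    console.log(data.toString().split('\r\n')[0]) // e.g. "HTTP/1.1 200 OK"
    socket.destroy()
})
socket.on('error', err => console.error(err.message))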

Why is this happening, though? My network bandwidth isn't even remotely exhausted while the load test is running, and the server should easily be able to handle more than 600 requests per second. The RAM usage of the Node process seems to stay between 50,000 K and 100,000 K (roughly 50-100 MB) and the CPU usage is around 1-3%.
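(For reference, the memory figures can also be sampled from inside the Node process with process.memoryUsage(); the logging interval below is only an illustration, not part of the load-test script.)

// Illustration only: log this process's own memory usage once per second.
setInterval(() => {
    const { rss, heapUsed } = process.memoryUsage()
    console.log(`rss: ${(rss / 1048576).toFixed(1)} MB, heapUsed: ${(heapUsed / 1048576).toFixed(1)} MB`)
}, 1000)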

I tried it with 10, 100 and even 1000 concurrent connections, but after 100 it doesn't really make a big difference.

Forivin
  • Could it be an asynchronous problem? You are sending a lot of requests, but one by one in a sequential manner, so I think it would be difficult to reach 600 requests. You may want to use Node workers (the thread equivalent in other programming languages). This way you can send multiple requests in parallel. – anlijudavid Feb 01 '18 at 16:02
  • I don't think you need the setInterval; that may cause some problems. Just try `for(let i ... i<1000000)` or any big number you want. – HMR Feb 01 '18 at 16:50
  • There are specific tools for that, like Apache Benchmark (ab); have you considered using them? – LMC Feb 01 '18 at 18:11
  • @LuisMuñoz No, I need to generate the URLs; I don't think this could be done with other tools. @HMR It most definitely cannot be a for loop. @anlijudavid 600 requests should not be a problem at all. As I said, my bandwidth isn't even remotely exhausted, and neither are my CPU and RAM. – Forivin Feb 01 '18 at 19:21
  • Sending this many requests a second can cause a DoS. Beware. – Neethu Lalitha Nov 08 '20 at 21:15

1 Answer

I cannot comment yet, but I think it is because of the concurrent request limit. You can read more in this answer:

Why is node.js only processing six requests at a time?
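If the suggestion refers to the default agent's per-host socket cap (maxSockets), one alternative to agent: false is a custom agent with a raised cap. A minimal sketch, not taken from the linked answer; the hostname, port and maxSockets value are placeholders:

const http = require('http')

// Illustrative custom agent; maxSockets and the target are placeholders.
const agent = new http.Agent({ keepAlive: false, maxSockets: 1000 })

const req = http.request({
    hostname: 'example.com',
    port: 80,
    path: '/',
    method: 'GET',
    agent
}, res => {
    res.resume() // drain the response so the socket is released
})
req.on('error', () => {})
req.end()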

Talgat Saribayev
  • No, it is not. First of all, there is no default limit anymore, and secondly, I have set agent to false, as you can see. – Forivin Feb 01 '18 at 19:32