I would like to send a large number of HTTP requests in order to load test a server. Every request has to create a new connection and should wait until the response has been received before closing the connection.
Also, a new URL is generated for every request; I cannot send all the requests to the same URL.
At first I thought using http.request would probably be the smartest way, because it's a core module. So I tried this:
const http = require('http')

// 100 interval timers, each firing as fast as the event loop allows
for (let i = 0; i < 100; i++) {
  setInterval(() => {
    if (reqCount >= maxConcurrentRequests)
      return
    reqCount++
    const req = http.request({
      hostname: config.hostname,
      port: config.port,
      path: getRandomPath(), // a fresh path for every request
      method: 'GET',
      agent: false // no keep-alive: a new connection per request
    }, () => {
      reqCount--
      showPerformance() // is used to calculate and display performance
    })
    req.on('error', (e) => {
      failedCount++
    })
    req.end()
  }, 0)
}
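For reference, showPerformance is just a throughput counter. A minimal sketch of what it does (hypothetical; my real version also renders the numbers) would be:

```javascript
// Hypothetical sketch: count completed requests and print the
// requests-per-second rate once every second.
let completedCount = 0
let lastTick = Date.now()

function showPerformance() {
  completedCount++
  const now = Date.now()
  if (now - lastTick >= 1000) {
    console.log(`${completedCount} requests/sec`)
    completedCount = 0
    lastTick = now
  }
}
```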
At first it sends about 600 requests per second, but after around 10 seconds the rate starts to drop, falling to 5-20 requests per second within another 10 seconds or so.
Then I decided to go a level lower and build the HTTP requests myself, sending them directly over TCP sockets with the net module:
const net = require('net')

for (let i = 0; i < 100; i++) {
  setInterval(() => {
    if (reqCount >= maxConcurrentRequests)
      return
    reqCount++
    const socket = new net.Socket()
    socket.connect(config.port, config.hostname, () => {
      // Hand-written HTTP/1.1 request; "Connection: close" makes the
      // server close the connection after responding
      socket.write(`GET /${getRandomPath()} HTTP/1.1\r\nHost: ${config.hostname}\r\nConnection: close\r\n\r\n`)
    })
    socket.on('data', data => {
      // assumes the whole response arrives in a single data event
      reqCount--
      showPerformance() // is used to calculate and display performance
    })
    socket.on('error', err => {
      failedCount++
    })
  }, 0)
}
But the result was pretty much the same.
Why is this happening, though? My network bandwidth isn't anywhere near exhausted while the load test is running, and the server should easily be able to handle more than 600 requests per second. The RAM usage of the Node process stays between 50,000 K and 100,000 K, and its CPU usage is around 1-3%.
I tried it with 10, 100, and even 1000 concurrent connections, but beyond 100 it doesn't really make a difference.