I wrote a JavaScript client-server application where I send 1000 HTTP requests to my server (listening on localhost:5000) and await the response for each:
const SendRequest = async () => {
  const res = await fetch("http://localhost:5000");
  return res;
};

for (let i = 0; i < 1000; i++) {
  SendRequest().then(res => console.log(res.status));
  console.log("request sent");
}
When I executed this code with Node.js, 1000 "request sent" messages were logged first, and then, after some time, 1000 response statuses from the server were logged in quick succession.
From my understanding, JavaScript is single-threaded, but Node.js can use other threads to handle things like setTimeout and awaiting an HTTP response (ref: What the heck is the event loop anyway? | Philip Roberts | JSConf EU). My system has 6 cores and 12 threads. My question is: how can it await thousands of such operations at once, even if it uses up all 12 threads?
Here is the server-side code:
const express = require("express");
const app = express();

app.get("/", (req, res) => {
  console.log("Connection established");
  setTimeout(() => {
    console.log("Sending res");
    res.send("Hello from the server");
  }, 5000); // send the response after waiting for 5 seconds
});

app.listen(5000);
Here, "Connection established" was logged 1000 times when I sent the requests from the client, and 5 seconds later "Sending res" was logged 1000 times within a matter of milliseconds, with all 1000 responses sent at once and received by the client. Again, my question is similar: how did these 1000 setTimeouts, each of 5 seconds, run concurrently even if we use all the threads in the system?