
I am trying to write a Node.js application that accepts incoming requests from clients and then makes a call to a web service on a remote server to retrieve data.

const express = require('express')
const request = require('request')
const moment = require('moment')

const app = express()

app.get('/', (req, res) => {

    request('http://localhost/sleep.php', (error, response, body) => {
        res.send('get data at ' + moment().format())
    })

})

app.listen(3000)

The remote service is written in PHP:

<?php

    sleep(10);
    echo 'Return something!';

The problem is that when a new request comes in, Node.js appears to be blocked until it has finished the last callback. How can I fix this? Any ideas?

Update:

I actually made two requests at the same time via Firefox, and the second request took almost 20 seconds.

Please see the image here

Hannah
  • googling "nodejs async await" produced this result: https://blog.risingstack.com/mastering-async-await-in-nodejs/ – Alexander Taran Feb 05 '18 at 14:02
  • Possible duplicate of [Asynchronous http calls with nodeJS](https://stackoverflow.com/questions/17106622/asynchronous-http-calls-with-nodejs) – Alexander Taran Feb 05 '18 at 14:03
  • @AlexanderTaran I don't see how this is related to async/await... – Andrew Li Feb 05 '18 at 14:04
  • I'm not seeing how Node is blocked in this example; you're not doing a sync request. Every request takes ~10 seconds, but you can handle multiple concurrent requests right now, unless you're not showing the full code. – Marcos Casagrande Feb 05 '18 at 14:06
  • I think @MarcosCasagrande is right - the [Express website](https://expressjs.com/en/advanced/best-practice-performance.html#dont-use-synchronous-functions) says to avoid synchronous functions, which would make no sense if Express itself wasn't handling requests concurrently. – Joe Clay Feb 05 '18 at 14:08
  • Node.js is not blocked in the example; the PHP server is slow because of the `sleep`. Every request should now take exactly 10 seconds. – Alex Michailidis Feb 05 '18 at 14:09
  • Updated the question with my screenshot. – Hannah Feb 05 '18 at 14:50
  • Interestingly enough, I am able to achieve concurrent requests with Firefox (only with HTTP/2), but Chrome seems to be sending them one by one. – Josh Lee Feb 05 '18 at 15:27
  • More to the point: This is definitely not a PHP problem, and most likely not a Node.js problem. It's a browser and HTTP problem. – Josh Lee Feb 05 '18 at 15:45
  • In fact, changing the concurrent requests to have distinct URLs (`/?1`, `/?2`, etc.) seems to be the answer. – Josh Lee Feb 05 '18 at 15:56

2 Answers


Here's a quick demonstration that concurrent requests for the same URL will not be pipelined by the browser, but requests for different URLs generally will be. Adding a distinct value to the query string is one way to work around this, e.g. localhost:3000/?1517849200341 generated with Date.now().
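
For instance, from the browser a cache-busting request could look like the following (a minimal sketch; it assumes the Express app from the question is listening on localhost:3000):

// Hypothetical cache-busting request: the timestamp makes each URL unique,
// so the browser won't queue it behind an identical in-flight request.
fetch('http://localhost:3000/?' + Date.now())
  .then(response => response.text())
  .then(body => console.log(body));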

(Broadly speaking, pipelining is disabled in HTTP/1.1, but browsers open multiple TCP connections to the same host to a similar effect. HTTP/2 instead multiplexes requests over one connection, up to a negotiated maximum number of concurrent streams. I don't really know how to interpret the result below beyond that.)

async function log(fn) {
  console.log(Date());
  await Promise.all(fn());
  console.log(Date());
}

const req1 = 'https://httpbin.org/delay/1';
const req2 = 'https://nghttp2.org/httpbin/delay/1';

const req3 = 'https://httpbin.org/delay/1?a';
const req4 = 'https://httpbin.org/delay/1?b';
const req5 = 'https://httpbin.org/delay/1?c';
const req6 = 'https://httpbin.org/delay/1?d';

const req7 = 'https://nghttp2.org/httpbin/delay/1?a';
const req8 = 'https://nghttp2.org/httpbin/delay/1?b';
const req9 = 'https://nghttp2.org/httpbin/delay/1?c';
const req10 = 'https://nghttp2.org/httpbin/delay/1?d';

btn1.addEventListener('click', () => log(() => [
  fetch(req1),
  fetch(req1),
  fetch(req1),
  fetch(req1),
  fetch(req1),
]));

btn2.addEventListener('click', () => log(() => [
  fetch(req2),
  fetch(req2),
  fetch(req2),
  fetch(req2),
  fetch(req2),
]));

btn3.addEventListener('click', () => log(() => [
  fetch(req1),
  fetch(req3),
  fetch(req4),
  fetch(req5),
  fetch(req6),
]));

btn4.addEventListener('click', () => log(() => [
  fetch(req2),
  fetch(req7),
  fetch(req8),
  fetch(req9),
  fetch(req10),
]));
<button id=btn1>HTTP/1.1, same URLs</button>
<button id=btn2>HTTP/2, same URLs</button>
<button id=btn3>HTTP/1.1, different URLs</button>
<button id=btn4>HTTP/2, different URLs</button>
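
As a cross-check outside the browser, a plain Node script can fire several requests for the same URL in parallel; if they all finish around the same time, the serialization seen in the browser is client-side behavior. This is a sketch, assuming the question's Express app is running on localhost:3000:

const http = require('http')

const start = Date.now()

// Fire three requests for the same URL at once. With the 10-second sleep
// upstream, all three should finish roughly 10 s after start if Node.js
// and Express are handling them concurrently.
for (let i = 1; i <= 3; i++) {
  http.get('http://localhost:3000/', res => {
    res.resume() // drain the response body
    res.on('end', () => {
      console.log('request ' + i + ' finished after ' + (Date.now() - start) + ' ms')
    })
  })
}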
Josh Lee

The Chrome cache is to blame. Open the Chrome dev console in each tab, tick 'Disable cache', then refresh each tab; you'll see the responses coming back concurrently. I assume Firefox also has a cache setting somewhere. Or use Postman to make multiple requests.
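
Another way to verify this, independent of any client, is to log a timestamp in the Express handler when each request arrives. This is a sketch built on the handler from the question; if the arrival timestamps are close together, Express is serving the requests concurrently and any delay is on the client side:

// Timestamp each request as it reaches Express and when its upstream
// call to sleep.php completes.
app.get('/', (req, res) => {
    console.log('request arrived at ' + moment().format())
    request('http://localhost/sleep.php', (error, response, body) => {
        console.log('upstream finished at ' + moment().format())
        res.send('get data at ' + moment().format())
    })
})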

Or, if you really want to see this working in multiple tabs, I guess you could also disable caching from the Node server (I don't recommend it for anything other than a proof of concept):

...
const nocache = require('nocache')
const app = express()
app.use(nocache())
...
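
If you'd rather not pull in another dependency, roughly the same effect can be had by setting the response headers by hand. This is an approximation; the exact header set the nocache package sends may differ:

// Hand-rolled substitute for nocache: tell clients and proxies not to cache.
app.use((req, res, next) => {
  res.set('Cache-Control', 'no-store, no-cache, must-revalidate')
  res.set('Pragma', 'no-cache')
  res.set('Expires', '0')
  next()
})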
Radu Luncasu
  • Disabling the cache from the client side isn't a realistic solution, though. – Josh Lee Feb 05 '18 at 15:51
  • @JoshLee what do you mean? It's not a solution; it's a way to prove that Express is in fact serving asynchronously but Chrome is not actually making the HTTP requests concurrently. There's nothing wrong with the code posted; the problem is the caching in the browser. – Radu Luncasu Feb 05 '18 at 16:02
  • That is a fair point: The goal is not actually to load the same URL in two browser windows; that is simply a technique to see whether Node.js can actually handle this. – Josh Lee Feb 05 '18 at 16:13
  • I don't think this is related to the browser cache, because it does not work even when I make a call from an incognito window. – Hannah Feb 06 '18 at 15:02
  • I also tested the `nocache` library, but got the same result. – Hannah Feb 06 '18 at 15:11
  • @Hannah, an incognito window in Chrome still uses the cache. The only reliable way to disable the Chrome cache is to open Dev Tools and, under Network, check the 'Disable cache' checkbox. If you do this you will definitely see the requests being served in parallel. – Radu Luncasu Feb 12 '18 at 18:13
  • This was driving me crazy. I was using the raw http module and had a similar issue, and I thought that maybe it was incapable of handling requests asynchronously; learning that Chrome prevents loading the same link concurrently, even across multiple tabs, was surprising. Once 'Disable cache' was ticked, voilà, it worked like a charm. – Darren S Jun 25 '21 at 11:25