
I make a request to an API endpoint that returns a lot of data and takes some time to respond. The request sometimes succeeds, but sometimes I get an ETIMEDOUT error. I tried increasing the request timeout, but that doesn't solve the problem. Is there a way to load the data in chunks, or to increase the server timeout?

shane_00

3 Answers


Using the http module with http.request() you can set a timeout, as explained here: How to set a timeout on a http.request() in Node?

Note that you can also load the data in chunks by handling the res.on('data', ...) event in your callback, like this:

const http = require('http');

// options describes the endpoint you are calling (placeholder values here)
const options = { host: 'example.com', path: '/data' };

const req = http.request(options, (res) => {
  res.on('data', (chunk) => {
    console.log(`BODY: ${chunk}`);
  });
  res.on('end', () => {
    console.log('No more data in response.');
  });
});
req.end();

Code and more details: https://nodejs.org/api/http.html#http_http_request_url_options_callback

qphi
  • Yes, streams using request should work. I also thought of streams, but I asked about using them in the server endpoint. But since this is a 3rd-party API, your solution should be good. – Rupjyoti Aug 28 '20 at 03:07

Yes, there is a way to send the response in chunks: use Node.js streams.

Below is an example.

const http = require('http');
const fs = require('fs');

http.createServer(function(req, res) {
  // The filename is simply the local directory plus the requested url
  var filename = __dirname + req.url;

  // Open the file as a readable stream
  var readStream = fs.createReadStream(filename);

  // Wait until we know the readable stream is actually valid before piping
  readStream.on('open', function() {
    // Pipe the read stream to the response object (which goes to the client)
    readStream.pipe(res);
  });

  // Catch any errors on the readable stream (usually invalid names)
  readStream.on('error', function(err) {
    res.statusCode = 500;
    res.end(err.message);
  });
}).listen(8080);

The above code is from Node.js documentation.

Focus on the part "readStream.pipe(res);"

The response is sent continuously while the file is being read. Even if the file is large, it can still be sent slowly and continuously to the client.

Check the documentation:

https://nodejs.org/en/knowledge/advanced/streams/how-to-use-fs-create-read-stream/

Similarly, you can also let the client stream a large video file, say 750 MB, with the same streaming process, though there are some more complications to handle for video.

Rupjyoti
  • Is this is for when trying to fetch from a 3rd party API? – shane_00 Aug 27 '20 at 21:14
  • No, this is to send large data from the server to a client. To fetch data from a 3rd-party API, use streams the opposite way, to read the response. Check the answer by @qphi – Rupjyoti Aug 28 '20 at 02:36

As people already mentioned, use Streams.

Suppose a client requests a big file from our server (file.txt in the code below). Using streams, we send this file in chunks instead of buffering it in memory. The server doesn't consume a lot of RAM and the client gets an immediate response; everyone is happy.

const fs = require("fs");
const http = require("http");
const server = http.createServer();
const port = 8000;

server.on("request", (req, res) => {
  const src = fs.createReadStream("./file.txt"); // file.txt is some big file
  src.pipe(res); // streams the file; no need to read it all into memory first
});

server.listen(port, () => console.log(`Server is listening on http://localhost:${port}`));
Andrey