I have a strange issue. I'm using request for file downloads in Node.js, and every time I download larger files (>250 MB) they are buffered into memory instead of being streamed directly to the filesystem. Maybe I'm doing something wrong, but I made a test case and the file is still not streamed.

var request = require('request');
var fs = require('fs');
var writable = fs.createWriteStream("1GB.zip");
var stream = request.get({
    uri: "http://ipv4.download.thinkbroadband.com/1GB.zip",
    encoding: null
}, function(error, response, body) {
    console.log("code:", response.statusCode);
    if (response.statusCode >= 500) {
        console.error(response.statusCode, "Server error");
    }
}).pipe(writable);

In this test case I'm downloading a sample 1 GB file, and if you watch the Node process in the task manager it grows to over 1 GB as the file downloads. I want my Node application to use no more than 200 MB of RAM.

takemylemons

1 Answer


The issue is that you're passing a callback, which implicitly enables buffering inside request, because one of the callback's parameters is the entire body of the response.

If you want to know when the response is available, just listen for the response event instead:

var request = require('request');
var fs = require('fs');
var writable = fs.createWriteStream("1GB.zip");
var stream = request.get({
  uri: "http://ipv4.download.thinkbroadband.com/1GB.zip",
  encoding: null
}).on('response', function(response) {
  console.log("code:", response.statusCode);
  if (response.statusCode >= 500) {
    console.error(response.statusCode, "Server error");
  }
}).pipe(writable);
mscdex