
How do I convert a stream into a buffer in Node.js? Here is my code to parse a file in a POST request in Express.

app.post('/upload', express.multipart({
  defer: true
}), function(req, res) {
  req.form.on('part', function(part) {

    // Here I want to convert the streaming part into a buffer
    // and do some buffer-specific task with it.

    var out = fs.createWriteStream('image/' + part.filename);
    part.pipe(out);
  });

  req.form.on('close', function() {
    res.send('uploaded!');
  });
});
sam100rav

2 Answers

49

Instead of piping, you can attach readable and end event handlers to the part stream to read it:

var buffers = [];
part.on('readable', function(buffer) {
  for (;;) {
    let buffer = part.read();
    if (!buffer) { break; }
    buffers.push(buffer);
  }
});
part.on('end', function() {
  var buffer = Buffer.concat(buffers);
  // ...do your stuff...

  // write to file:
  fs.writeFile('image/' + part.filename, buffer, function(err) {
    // handle error, return response, etc...
  });
});

Note: if you listen for data events instead, the stream runs in flowing mode and chunks are pushed to your handler as fast as they arrive, whereas with readable/read() you pull data at your own pace. Either way, the entire upload ends up in memory, so only do this for reasonably small files.
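For comparison, a minimal sketch of the data-event (flowing mode) variant; chunks are pushed to the handler as soon as they arrive:

var chunks = [];
part.on('data', function (chunk) {
  chunks.push(chunk);
});
part.on('end', function () {
  var buffer = Buffer.concat(chunks);
  // ...do your stuff...
});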

You could also create a custom transform stream to transform the incoming data, but that might not be trivial.
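For example, here is a minimal sketch of such a transform stream; the byte-counting behavior is made up purely for illustration, the point is the _transform(chunk, encoding, callback) hook:

var stream = require('stream');
var fs = require('fs');

var counter = new stream.Transform();
counter.bytes = 0;
counter._transform = function (chunk, encoding, callback) {
  this.bytes += chunk.length; // inspect (or modify) the chunk here
  this.push(chunk);           // pass the chunk along unchanged
  callback();
};

// Pipe the upload through the transform and into a file
part.pipe(counter).pipe(fs.createWriteStream('image/' + part.filename));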

robertklep
  • Should use on('readable'), not on('data'). The former works efficiently, the latter causes memory leaks. – coolaj86 Aug 10 '21 at 02:21
  • Note that no parameter is sent for the 'readable' event - `buffer` is just a trick by the author to skip explicit variable declaration. It will always initially be `undefined` – Gershom Maes Feb 02 '22 at 00:04
  • @GershomMaes it's actually an artifact of the edit from @coolaj86 (`buffer` is declared inside the `for` loop so it's not a trick). I'll leave it for posterity. – robertklep Feb 02 '22 at 09:11
  • Np! For the record I wasn't critiquing your style - I was just concerned for the poor dev who tries to use parameters sent from the `readable` event :) – Gershom Maes Feb 02 '22 at 18:52
25

You can use the stream-to module, which can convert a readable stream's data into an array or a buffer:

var streamTo = require('stream-to');
req.form.on('part', function (part) {
    streamTo.buffer(part, function (err, buffer) {
        // Insert your business logic here
    });
});

If you want a better understanding of what's happening behind the scenes, you can implement the logic yourself using a Writable stream. As a writable stream implementer, you only have to define one method: _write(chunk, encoding, callback), which is called every time some data is written to the stream (call the callback once you've handled the chunk). When the input stream has finished emitting data, the writable emits the finish event: we then create a buffer using the Buffer.concat method.

var stream = require('stream');
var converter = new stream.Writable();

// We'll store all the incoming chunks in this array
converter.data = [];
converter._write = function (chunk, encoding, callback) {
    this.data.push(chunk);
    callback();
};

// 'finish' is emitted when the input stream has ended,
// i.e. no more data will be provided
converter.on('finish', function() {
    // Create a single buffer from all the received chunks
    var b = Buffer.concat(this.data);

    // Insert your business logic here
});
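
To use it, pipe each incoming part into the converter; a minimal sketch, assuming the same Express setup as in the question (create a fresh converter per part if the form can contain several files):

req.form.on('part', function (part) {
    part.pipe(converter);
});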
Paul Mougel