
How can I transform a Node.js buffer into a Readable stream using the streams2 interface?

I have already found this answer and the stream-buffers module, but that module is based on the streams1 interface.

Jerome WAGNER

3 Answers


The easiest way is probably to create a new PassThrough stream instance, and simply push your data into it. When you pipe it to other streams, the data will be pulled out of the first stream.

var stream = require('stream');

// Initiate the source
var bufferStream = new stream.PassThrough();

// Write your buffer
bufferStream.end(Buffer.from('Test data.'));

// Pipe it to something else (e.g. stdout)
bufferStream.pipe(process.stdout);
zjonsson
    Unless node.js does so internally, this solution doesn't slice up the buffer into smaller chunks and so might not be ideal for some pipe destinations. *But* if you look, neither does the streamifier library from the accepted answer. So +1 for keeping it simple. – natevw Feb 06 '14 at 20:02
    I do wonder if using `var bufferStream = stream.PassThrough();` might make the intent clearer to later readers of the code, though? – natevw Feb 06 '14 at 20:12
    Also, note that if your destination expects the stream to finish at some point you'll likely need to call `bufferStream.end()`. – natevw Feb 06 '14 at 21:01
    @natevw There's no need to slice the buffer because the internal code of streams2 takes care of it (search "fromList", [here](https://github.com/joyent/node/blob/master/lib/_stream_readable.js)). Actually, if you slice the buffer, the performance will be worse because if the stream needs to read more bytes than the buffer length, then if you slice it, streams2 will concat them again ([here](https://github.com/joyent/node/blob/master/lib/_stream_readable.js#L832)). – Gabriel Llamas Feb 06 '14 at 21:59
  • This requires two steps while `streamifier` only requires one. – binki Oct 24 '16 at 00:07
  • How would one test or set the chunk size? –  Jun 02 '17 at 03:19
  • I am not able to call events on the bufferStream object. For example: `bufferStream.on('readable', function () { var chunk; while (null !== (chunk = bufferStream.read())) { /* do something */ } }).on('end', function () { /* do something */ });` – Shaik Syed Ali Mar 11 '19 at 07:43
  • Note that using the `Buffer` constructor has been deprecated. Use the `Buffer.from('Test data.')` method instead. – Boaz May 31 '20 at 13:00
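Regarding the chunk-size question in the comments above: a minimal sketch of one way to control it yourself is to subclass Readable via simplified construction and push fixed-size slices. The name bufferToChunkedStream and the 64 KiB default are illustrative, not from any answer here.

const { Readable } = require('stream');

function bufferToChunkedStream(buffer, chunkSize = 64 * 1024) {
    let offset = 0;
    return new Readable({
        read() {
            if (offset >= buffer.length) {
                this.push(null); // signal end-of-stream
                return;
            }
            this.push(buffer.slice(offset, offset + chunkSize));
            offset += chunkSize;
        }
    });
}

// Observe the chunk sizes on the consuming side:
bufferToChunkedStream(Buffer.alloc(200 * 1024))
    .on('data', function (chunk) { console.log(chunk.length); });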

As natevw suggested, it's even more idiomatic to use a stream.PassThrough, and end it with the buffer:

var stream = require('stream');

var buffer = Buffer.from('foo');
var bufferStream = new stream.PassThrough();
bufferStream.end(buffer);
bufferStream.pipe(process.stdout);

This is also how buffers are converted/piped in vinyl-fs.

morris4
  • 1,977
  • 17
  • 14
  • 2
Why would you `end` with the entire buffer? And why does `end` come after `pipe` here? – Startec Feb 16 '15 at 02:36
    `end( buffer )` is just `write( buffer )` and then `end()`. I end the stream because it is not needed anymore. The order of end/pipe does not matter here, because PassThrough only starts emitting data when there's some handler for data events, like a pipe. – morris4 Feb 17 '15 at 16:54
    @Startec Not slicing up the buffer means less overhead. If your consumer cannot handle large chunks, then guard it with something that splits chunks. – binki Oct 24 '16 at 14:50
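To make the equivalence in that comment concrete, `bufferStream.end(buffer)` behaves like the two calls below (a trivial sketch, assuming the same `bufferStream` and `buffer` as in the answer):

bufferStream.write(buffer); // queue the data
bufferStream.end();         // then signal end-of-stream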

A modern, simple approach that is usable anywhere you would use fs.createReadStream(), but without having to first write the file to a path:

const { Duplex } = require('stream'); // native Node module

function bufferToStream(myBuffer) {
    const tmp = new Duplex();
    tmp.push(myBuffer); // queue the whole buffer
    tmp.push(null);     // signal end-of-stream
    return tmp;
}

const myReadableStream = bufferToStream(your_buffer);
  • myReadableStream is re-usable.
  • The buffer and the stream exist only in memory, without writing to local storage.
  • I use this approach often when the actual file is stored at some cloud service and our API acts as a go-between (a sketch follows this list). Files never get written to local disk.
  • I have found this to be very reliable no matter the buffer (up to 10 MB) or the destination that accepts a Readable stream. Larger files should implement
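As a sketch of that go-between use case (the Express app and the fetchFileFromCloud helper are hypothetical illustrations, not part of the answer):

const express = require('express');
const app = express();

// Hypothetical stub standing in for a real cloud-storage call:
const fetchFileFromCloud = async (id) => Buffer.from('contents of ' + id);

app.get('/files/:id', async (req, res) => {
    const buffer = await fetchFileFromCloud(req.params.id); // Buffer from the cloud service
    bufferToStream(buffer).pipe(res); // stream to the client; nothing touches local disk
});

app.listen(3000);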
factorypolaris