159

I have a library that takes a ReadableStream as input, but my input is just a base64-encoded image. I could convert the data I have into a Buffer like so:

var img = new Buffer(img_string, 'base64');

But I have no idea how to convert it to a ReadableStream or convert the Buffer I obtained to a ReadableStream.

Is there a way to do this?

vvvvv
Masiar

9 Answers

221

For Node.js 10.17.0 and up:

const { Readable } = require('stream');

const stream = Readable.from(myBuffer);
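
Applied to the base64 image from the question, a minimal sketch along these lines (img_string comes from the question; destination stands in for whatever writable stream your library expects):

const { Readable } = require('stream');

// img_string: the base64-encoded image from the question
// destination: assumed here, any writable stream (e.g. a file or HTTP response)
const buffer = Buffer.from(img_string, 'base64');
const stream = Readable.from(buffer);
stream.pipe(destination);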
Steffan
iamarkadyt
98

Something like this:

import { Readable } from 'stream'

const buffer = Buffer.from(img_string, 'base64')
const readable = new Readable()
readable._read = () => {} // _read is required but you can noop it
readable.push(buffer)
readable.push(null)

readable.pipe(consumer) // consume the stream

In the general case, a readable stream's _read function should collect data from the underlying source and push it incrementally, ensuring you don't pull a huge source into memory before it's needed.

In this case, though, you already have the source in memory, so _read is not required.

Pushing the whole buffer just wraps it in the readable stream API.
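
For contrast, a rough sketch of that incremental pattern, slicing an in-memory buffer on demand instead of pushing it all at once (the helper name and the 16 KiB chunk size are just illustrative choices):

const { Readable } = require('stream');

// Hand out the buffer in slices only when the consumer asks for more.
function chunkedStream(largeBuffer, chunkSize = 16 * 1024) {
  let offset = 0;
  return new Readable({
    read() {
      if (offset >= largeBuffer.length) {
        this.push(null); // nothing left: end the stream
        return;
      }
      this.push(largeBuffer.subarray(offset, offset + chunkSize));
      offset += chunkSize;
    },
  });
}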

David Braun
Mr5o1
  • 2
    Wouldn't it be more correct to `push()` the buffer inside the `_read()` method? I.e. `readable._read = () => {readable.push(buffer); readable.push(null);}` . Not sure it matters, but allowing the stream to manage the timing of when data is fed into seems less likely to run into unexpected behavior. Other than that this should be the accepted answer, as it doesn't rely on 3rd party modules. – broofa Oct 04 '18 at 23:53
  • 1
    Generally, you'd be right, but for this specific use case I wouldn't `push` inside the `read` method. Conceptually I think `_read` should be reserved for "harvesting" data from an underlying source. In this case we not only have the data in memory, but no conversion is required. So for wrapping data in a stream this is how I would do it, but for converting or accumulating data in a stream, that logic would happen in the `_read` method. – Mr5o1 Oct 05 '18 at 05:02
  • Your underlying source is the buffer ;) – Franck Freiburger Mar 04 '20 at 15:50
  • @FranckFreiburger Yes, but you're not "harvesting" data from that source, it's already in memory and you're always going to consume it all in one go, you're not pulling it in on demand. – Mr5o1 Mar 04 '20 at 22:44
36

Node Stream Buffer is obviously designed for use in testing; the inability to avoid a delay makes it a poor choice for production use.

Gabriel Llamas suggests streamifier in this answer: How to wrap a buffer as a stream2 Readable stream?

Bryan Larsen
31

You can create a ReadableStream using Node Stream Buffers like so:

var streamBuffers = require('stream-buffers');

// Initialize stream
var myReadableStreamBuffer = new streamBuffers.ReadableStreamBuffer({
  frequency: 10,      // in milliseconds.
  chunkSize: 2048     // in bytes.
}); 

// With a buffer
myReadableStreamBuffer.put(aBuffer);

// Or with a string
myReadableStreamBuffer.put("A String", "utf8");

The frequency cannot be 0, so this will introduce a certain delay.
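
For completeness, a hedged usage sketch, assuming a writable destination and using the library's stop() method to signal that no more data will be put:

myReadableStreamBuffer.put(Buffer.from(img_string, 'base64'));
myReadableStreamBuffer.stop();             // no more data will be added
myReadableStreamBuffer.pipe(destination);  // destination: any writable stream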

Doug
vanthome
  • Thanks, even though a bit late. I don't remember how I solved the problem, but this looks like a nice solution. If anybody confirms this, it would be great. I remember finding ZERO about this conversion. – Masiar Jan 17 '13 at 10:20
  • 1
    Confirming that it works - found this when looking up how to turn filebuffers into streams. – Jack Lawson Mar 28 '13 at 18:12
  • If you are dealing with files, you should rather open a file read stream straight away with this: http://nodejs.org/api/fs.html#fs_fs_createreadstream_path_options – vanthome Apr 01 '13 at 18:05
  • 2
    Milliseconds is not a measurement of frequency - I suppose they mean period. – UpTheCreek Jan 19 '16 at 19:11
  • @UpTheCreek I cannot change it as this is the property name and the unit IS milliseconds. – vanthome Jan 19 '16 at 22:16
  • @vanthome - I wasn't suggesting it was your fault :) Just poor naming on the part of the node-stream-buffer devs. – UpTheCreek Jan 20 '16 at 10:43
  • _"Node Stream Buffer is obviously designed for use in testing; the inability to avoid a delay makes it a poor choice for production use."_ — [Bryan Larsen](https://stackoverflow.com/a/18190310/65732) – sepehr Mar 23 '18 at 12:24
21

You can use the standard Node.js stream API for this: stream.Readable.from.

const { Readable } = require('stream');
const stream = Readable.from(buffer);

Note: Don't convert a buffer to string (buffer.toString()) if the buffer contains binary data. It will lead to corrupted binary files.
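
A small illustration of why that matters (just a sketch, using the first bytes of a JPEG header):

const { Readable } = require('stream');

const binary = Buffer.from([0xff, 0xd8, 0xff, 0xe0]); // start of a JPEG header
const roundTripped = Buffer.from(binary.toString());  // UTF-8 round trip mangles invalid byte sequences
console.log(binary.equals(roundTripped));             // false: the data is corrupted

const stream = Readable.from(binary);                 // pass the Buffer itself instead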

Ihor Sakailiuk
10

You don't need to add a whole npm lib for a single file. I refactored it to TypeScript:

import { Readable, ReadableOptions } from "stream";

export class MultiStream extends Readable {
  _object: any;
  constructor(object: any, options: ReadableOptions) {
    super(object instanceof Buffer || typeof object === "string" ? options : { objectMode: true });
    this._object = object;
  }
  _read = () => {
    this.push(this._object);
    this._object = null;
  };
}

Based on node-streamifier (the best option, as said above).
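
A quick usage sketch of the class above (destination is assumed and not part of the answer):

const stream = new MultiStream(Buffer.from(img_string, 'base64'), {});
stream.pipe(destination);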

Joel Harkes
7

Here is a simple solution using the streamifier module.

const streamifier = require('streamifier');
streamifier.createReadStream(Buffer.from([97, 98, 99])).pipe(process.stdout);

You can pass a String, Buffer, or Object as its argument.
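
For instance, the same call with a string instead of a Buffer:

streamifier.createReadStream('abc').pipe(process.stdout);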

Nick Larsen
Shwetabh Shekhar
  • 1
    Another equivalent alternative is [tostream](https://github.com/seedalpha/tostream): `const toStream = require('tostream'); toStream(new Buffer ([97, 98, 99])).pipe(process.stdout);` – Yushin Washio Dec 15 '19 at 20:51
  • 1
    @YushinWashio Definitely. Plenty of modules are available in Node. – Shwetabh Shekhar Dec 16 '19 at 06:41
7

This is my simple code for it:

import { Readable } from 'stream';

const newStream = new Readable({
  read() {
    this.push(someBuffer);
    this.push(null); // end the stream, otherwise read() keeps getting called
  },
});

Without the push(null) call the stream never ends, and read() keeps pushing the same buffer every time the consumer asks for more data.
Richard Vergis
2

Try this:

const Duplex = require('stream').Duplex;  // core NodeJS API
function bufferToStream(buffer) {
  let stream = new Duplex();
  stream.push(buffer);  // queue the whole buffer
  stream.push(null);    // signal end-of-stream
  return stream;
}

Source: Brian Mancini -> http://derpturkey.com/buffer-to-stream-in-node/
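
A quick usage sketch (destination stands in for whatever writable stream you are piping to; it is not part of the original post):

bufferToStream(Buffer.from(img_string, 'base64')).pipe(destination);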

mraxus