
I'm wondering how Node.js operates if you pipe two different read streams into the same destination at the same time. For example:

var fs = require('fs');

var a = fs.createReadStream('a');
var b = fs.createReadStream('b');
var c = fs.createWriteStream('c');
a.pipe(c, { end: false });
b.pipe(c, { end: false });

Does this write a into c, then b into c? Or does it mess everything up?

B T
  • Streams are built on event emitters. This allows chunks of data to be read and piped to the file you're writing to. It is safest to assume that each chunk can arrive from any pipe, so what you get in your file may be garbled. Tests on my machine show simple concatenation for small files on Node v0.11.10 though. Even so, I would avoid doing this. – qubyte Feb 14 '14 at 00:48
  • 1
    That's what I'd expect too. So I'd have to wait until a ends before starting b then. – B T Feb 14 '14 at 01:31

1 Answer


You want to start piping the second read stream from a listener for the first stream's 'end' event:

var fs = require('fs');

var a = fs.createReadStream('a');
var b = fs.createReadStream('b');
var c = fs.createWriteStream('c');

a.pipe(c, { end: false });   // keep c open when a finishes
a.on('end', function () {
  b.pipe(c);                 // c can end normally when b ends
});
  • could this be achieved with a dynamic amount of readable streams? in some sort of loop on an array of readable streams? – user1063287 Jul 23 '19 at 01:33
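To answer the comment above: yes, the same idea extends to any number of sources by piping the next stream from the previous stream's 'end' handler. Here is a minimal sketch, assuming a hypothetical helper named pipeSequentially and an array of file names; only the last source is allowed to end the destination:

var fs = require('fs');

// Hypothetical helper: pipes an array of readable streams into `dest`
// one after another, ending `dest` only after the last source finishes.
function pipeSequentially(sources, dest) {
  function pipeNext(index) {
    if (index >= sources.length) {
      dest.end();                          // all sources done, close the destination
      return;
    }
    var source = sources[index];
    source.pipe(dest, { end: false });     // keep dest open between sources
    source.on('end', function () {
      pipeNext(index + 1);                 // start the next source only after this one ends
    });
  }
  pipeNext(0);
}

// Example usage with three files:
var sources = ['a', 'b', 'c'].map(function (name) {
  return fs.createReadStream(name);
});
pipeSequentially(sources, fs.createWriteStream('out'));

Piping sequentially like this avoids the interleaving problem mentioned in the question's comments, since only one source is flowing into the destination at a time.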