
Consider the Node.js app below:

var spawn = require('child_process').spawn,
    dir = spawn('cmd', ['/c', 'dir', '*.txt', '/b', '/s']);

dir.stdout.on('data', function (data) {
    //(A)
    console.log('stdout: ' + data);
});

At (A), the 'data' event handler waits for stdout output, and one might expect the output to arrive 'line by line' from cmd /c dir *.txt /b /s.

But that is not what happens. The data chunk can contain more than one line of stdout output, and to do something with each file path we have to split it on CRLF (\r\n). Why does this happen?
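
(For reference, a minimal sketch of the CRLF-splitting workaround described above. It assumes the same cmd /c dir invocation and buffers across chunks, since a chunk boundary can fall in the middle of a line:)

var spawn = require('child_process').spawn,
    dir = spawn('cmd', ['/c', 'dir', '*.txt', '/b', '/s']),
    buffered = '';

dir.stdout.on('data', function (data) {
    buffered += data.toString();        // a chunk is raw bytes, not necessarily whole lines
    var lines = buffered.split('\r\n');
    buffered = lines.pop();             // keep any trailing partial line for the next chunk
    lines.forEach(function (line) {
        if (line) console.log('file: ' + line);
    });
});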

tetri
  • If the child process supports streaming, it will stream the response back as you'd expect, one by one. The dir command obviously just returns a whole block of text to be displayed in the console window. – cillierscharl Jul 22 '13 at 14:28
  • @f0x Streaming is not the same as line-by-line. A stream could be anything; a single line, multiple lines, a partial line. – Joe Jul 22 '13 at 18:24
  • @Joe True, I guess the flow I was trying to describe is characterized by the event emitter pattern, where chunks of data are emitted one at a time. – cillierscharl Jul 23 '13 at 06:52

1 Answer


Because this is just a pure data stream from the child process's standard output. There is no knowledge of whether that data is in any particular format, or whether it will contain any specific characters at all, so the data is treated as a stream of bytes and handled in chunks with no regard for the content or meaning of those bytes. That is the most general form of piping data around the system. Note, however, that there are wrapper streams that will buffer the raw data stream and give you a series of lines of text; you will find many modules for this on npmjs.org.
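
For example, Node's built-in readline module can serve as such a line-buffering wrapper (a minimal sketch, assuming the same cmd /c dir invocation as in the question):

var spawn = require('child_process').spawn,
    readline = require('readline'),
    dir = spawn('cmd', ['/c', 'dir', '*.txt', '/b', '/s']),
    rl = readline.createInterface({ input: dir.stdout, terminal: false });

// each 'line' event delivers one complete line, with the line ending already stripped
rl.on('line', function (line) {
    console.log('file: ' + line);
});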

Peter Lyons