
I have an expect script that does passwordless SSH to another box and dumps a file's contents using the command: tail -f ...

Because I am using tail's -f option, the command waits until new data is appended to the file and dumps it immediately.

I am using exec to start the script from my NodeJS script:

var child = exec('script.sh process1 process2', function (err, stdout, stderr) {
  if (err) {
    console.log("Error");
    return;
  }
  // Split the buffered output into lines and log each one
  stdout.toString().split("\r\n").forEach(function (result) {
    logger.info(result);
  });
});

But I am getting the following error in the console logs:

Error in expect script ::Error: maxBuffer exceeded.

Since the output is a continuous stream of data on stdout, how can I achieve this? I tried using spawn, but I get the error below:

NodeJS : warning: possible EventEmitter memory leak detected. 11 listeners added. Use emitter.setMaxListeners() to increase limit

How can I overcome this issue?

EDIT::

It seems the callback or events are only called once the command in question has finished running or returned an error. Is there a way to get the output while my command is still executing in the background?

Programmer

1 Answer


Looks like spawn is the way to go, as it's meant to handle streams, whereas exec has a max buffer of 200 KB.

Your spawn output is just a warning: possible EventEmitter memory leak detected

Amir T
  • I tried adding process.setMaxListeners(0) but still get the same result. My NodeJS version is v0.8.15. Even though it's a warning, I do not get data in stdout - the logger in child.stdout.on('data', function (data){ }); does not get printed in the console. – Programmer Dec 24 '12 at 07:14