
I'm trying to stream a cat command using the ssh2 module, but it just hangs at some point during execution. I'm executing cat there.txt, where there.txt is around 10 MB or so.

For example:

var local = fs.createWriteStream('here.txt');
conn.exec('cat there.txt', function(err, stream) {
    if (err) throw err;
    stream.pipe(local).on('finish', function() { console.log('Done'); });
});

This just completely stops at one point. I've even piped the stream to local stdout, and it just hangs after a while. In my actual code, I pipe it through a bunch of other transform streams so I think this is better than transferring the files to the local system first (the files may get larger than 200MB).

  • I know this doesn't exactly answer your question, but couldn't you just [scp](https://stackoverflow.com/a/343723/897968) it? – FriendFX Jun 01 '15 at 03:38
  • I should've mentioned that I also want to use this for commands that I execute on the remote server that don't necessarily output a file (just to stdout). I'm not sure what would be faster: 1) write the command's output to a remote file, transfer that file to the local server, and transform it and then write it to a local file. or 2) make them output to stdout like they do and stream that to the local server, transform it and then write it to a local file. – nightorday Jun 01 '15 at 11:40
  • What do you mean by "hangs?" Do you mean that the local process is still running? If so, you need to end the connection when the exec stream closes (or when your writable fs stream finishes, either way). FWIW I've tried your code repeatedly with a 10MB file and it works every time for me. – mscdex Jun 01 '15 at 15:56
  • Also, could you mention what ssh2 module version and node/io.js version you're using? – mscdex Jun 01 '15 at 15:58
  • I just updated to 0.4.8 and I have the same problem. By hanging, I mean some of the data is transferred to my local fs stream, but the ssh stream no longer pipes anything to it after 25% of the data is transferred. Even though the process on the remote server is still running. – nightorday Jun 02 '15 at 13:59

2 Answers


I had only recently started working with streams, so when I was piping the ssh stream through various transform streams, I wasn't ending on a writable stream as in my example (I should've included my actual code, sorry!). That's what caused the hang. The goal was to execute multiple commands remotely and write their output, sorted, into a single file.

So, my original code was stream.pipe(transformStream), then pushing transformStream to an array once it finished, and sorting the results with the mergesort-stream npm module. Instead of that, I now write the (transformed) results of the multiple ssh commands to temporary files and then sort them all at once.


Try createReadStream for serving large files:

var fs = require('fs');

fs.exists(correctfilepath, function(exists) {
  if (exists) {
    var readstream = fs.createReadStream(correctfilepath);

    console.log("About to serve " + correctfilepath);

    res.writeHead(200);
    readstream.setEncoding("binary");

    readstream.on("data", function (chunk) {
        res.write(chunk, "binary");
    });

    readstream.on("end", function () {
        console.log("Served file " + correctfilepath);
        res.end();
    });

    readstream.on('error', function(err) {
        res.write(err + "\n");
        res.end();
        return;
    });
  } else {
    res.writeHead(404);
    res.write("No data\n");
    res.end();
  }
});
– Aleksandr