14

My Node.js program - an ordinary command-line program that by and large doesn't do anything operationally unusual, nothing system-specific or asynchronous or anything like that - needs to write messages to a file from time to time. It will then be interrupted with ^C, and the contents of the file need to still be there.

I've tried using fs.createWriteStream but that just ends up with a 0-byte file. (The file does contain text if the program ends by running off the end of the main file, but that's not the scenario I have.)

I've tried using winston but that ends up not creating the file at all. (The file does contain text if the program ends by running off the end of the main file, but that's not the scenario I have.)

And fs.writeFile works perfectly when you have all the text you want to write up front, but doesn't seem to support appending a line at a time.

What is the recommended way to do this?

Edit: specific code I've tried:

var fs = require('fs')

var log = fs.createWriteStream('test.log')
for (var i = 0; i < 1000000; i++) {
    console.log(i)
    log.write(i + '\n')
}

Run for a few seconds, hit ^C, leaves a 0-byte file.

rwallace
  • Can you share the code for each part you have tried? The question is vague without really showing how you have used writeStream, winston, etc. – Abhishek Apr 26 '17 at 20:18
  • `createWriteStream` should work for this case. What is the return value of the `write` call? If it returns `false`, you should do further writes inside the `drain` handler, or it just keeps buffering your writes to memory. Also, are you writing inside a loop without releasing CPU? Posting some code would be helpful. – Vasan Apr 26 '17 at 20:19
  • @Abhishek Okay, added specific code for createWriteStream. – rwallace Apr 26 '17 at 20:36
  • @Vasan Writing inside a loop without releasing CPU, yes, that's what my program does; it's a long-running computational workload, not a Web server or anything like that. Further writes inside the drain handler - I'm not sure what that means; there isn't a drain handler, and if there were, wouldn't the rest of the program have to be located inside it? – rwallace Apr 26 '17 at 20:38
  • @rwallace in this case, you should try listening on `SIGTERM` to close resources. https://nodejs.org/api/process.html#process_signal_events – Churro Apr 26 '17 at 20:38
  • @Churro Okay, tried `process.on('SIGTERM', () => log.close())` but still ends up with a 0-byte file. What should I be doing instead? – rwallace Apr 26 '17 at 20:47
  • 1
    https://github.com/nodejs/node/issues/6456 – Vlad Holubiev Apr 26 '17 at 20:58
  • @VladHolubiev if I understand correctly that link says it's not just particular stuff I've been doing, it applies to everything in node, including stdout, and that workaround is for stdout albeit not files? okay, thanks. – rwallace Apr 26 '17 at 21:03

4 Answers

11

Turns out Node provides a lower-level synchronous file I/O API that seems to work fine!

var fs = require('fs')

var log = fs.openSync('test.log', 'w')
for (var i = 0; i < 100000; i++) {
    console.log(i)
    fs.writeSync(log, i + '\n')
}
rwallace
6

NodeJS doesn't work in the traditional way. It uses a single thread, so by running a large loop and doing I/O inside it, you aren't giving it a chance (i.e. releasing the CPU) to do other async operations, e.g. flushing the memory buffer to the actual file.

The logic must be: do one write, then pass your function (which invokes the write) as a callback to process.nextTick, or as a callback for the write stream's drain event (if the buffer was full during the last write).

Here's a quick and dirty version which does what you need. Notice that there are no long-running loops or other CPU blockage; instead I schedule each subsequent write for the future and return quickly, momentarily freeing up the CPU for other things.

var fs = require('fs')

var log = fs.createWriteStream('test.log');
var i = 0;

function my_write() {
    if (i++ < 1000000) {
        var res = log.write(i + "\r\n");
        if (!res) {
            // buffer is full: wait for 'drain' before writing again
            // ('once' avoids stacking up a new listener on every full write)
            log.once('drain', my_write);
        } else {
            // yield to the event loop, then continue
            process.nextTick(my_write);
        }
        console.log("Done " + i + " " + res);
    }
}

my_write();
Vasan
  • Okay, so you're saying I need to abandon Node.js and use a different language. While I can't be certain that's not the case, it seems overly pessimistic for something that otherwise seems to work well. The grass is not green on the other side; other languages seem to have issues worse than this one. – rwallace Apr 26 '17 at 20:50
  • 2
    I am not saying that. I've used node for the exact purpose you're using it now. It just needs a slightly different mode of thinking. Let me add some code to explain better. – Vasan Apr 26 '17 at 20:53
  • 1
    @rwallace I've added a code sample - please see if it helps your understanding. – Vasan Apr 26 '17 at 21:20
  • Okay, I'll probably end up just doing without, but thanks for the suggestions anyway, upvoted. – rwallace Apr 27 '17 at 01:44
  • 1
    Inside an asynchronous function, you should be able to do `log.write(...) || await new Promise(resolve => log.once('drain',resolve));` to keep a one-liner for the write operation. – FremyCompany Mar 19 '19 at 14:14
3

This function might also be helpful.

/**
 * Write `data` to a `stream`. If the buffer is full, waits
 * until it has drained and is ready to be written to again.
 * [see](https://nodejs.org/api/stream.html#stream_writable_write_chunk_encoding_callback)
 */
export function write(data, stream) {
  return new Promise((resolve, reject) => {
    if (stream.write(data)) {
      process.nextTick(resolve);
    } else {
      stream.once("drain", () => {
        stream.off("error", reject);
        resolve();
      });
      stream.once("error", reject);
    }
  });
}
Safareli
  • On node 14.7 does not seem to flush the output until calling `stream.close()`, as it was supposed to do. Nothing changing using `on` instead of `once` for the `drain` event. – loretoparisi Sep 26 '22 at 19:42
-1

You are writing into the file using a for loop, which is bad, but that's a separate issue. First of all, createWriteStream doesn't close the file automatically; you should call close. If you call close immediately after the for loop, it will close without writing, because the writes are async.

For more info read here: https://nodejs.org/api/fs.html#fs_fs_createwritestream_path_options

The problem is doing async writes inside a for loop.

Abhishek
  • 1
    That documentation says auto-close defaults to true? – rwallace Apr 26 '17 at 20:45
  • Yes, but you should call close. Since your script terminates so quickly, you don't see the writes in the file. Try something like a setTimeout for 5 sec at the end of your script. – Abhishek Apr 26 '17 at 20:47