754

I am trying to append a string to a log file. However, writeFile erases the existing content each time before writing the string.

const fs = require('fs');

fs.writeFile('log.txt', 'Hello Node', function (err) {
  if (err) throw err;
  console.log('It\'s saved!');
}); // => log.txt erased, contains only 'Hello Node'

Any idea how to do this the easy way?

Vipul Patil
supercobra

18 Answers

1144

For occasional appends, you can use appendFile, which creates a new file handle each time it's called:

Asynchronously:

const fs = require('fs');

fs.appendFile('message.txt', 'data to append', function (err) {
  if (err) throw err;
  console.log('Saved!');
});

Synchronously:

const fs = require('fs');

fs.appendFileSync('message.txt', 'data to append');

But if you append repeatedly to the same file, it's much better to reuse the file handle.
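A minimal sketch of what reusing the handle could look like, using a write stream opened in append mode (the file name and helper function are just examples):

const fs = require('fs');

// Open the file once in append mode ('a') and keep the stream around.
const log = fs.createWriteStream('app.log', { flags: 'a' });

function appendLine(line) {
  // Every call reuses the same underlying file descriptor.
  log.write(line + '\n');
}

appendLine('first entry');
appendLine('second entry');

// Close the descriptor once you are completely done appending.
log.end();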

Dan Dascalescu
denysonique
  • Does anyone know if **fs.appendFile** keeps a link to the file open so appends are faster (rather than opening/closing on each write)? http://nodejs.org/api/fs.html#fs_fs_appendfile_filename_data_encoding_utf8_callback – nelsonic Oct 30 '12 at 14:16
  • @nelsonic According to the source code, there is no particular treatment. – Maël Nison Aug 04 '14 at 09:51
  • In case it's handy: note that this is async. This can result in weird timing and other issues. For example, if you call `process.exit()` right after `fs.appendFile`, you may exit before the output is written. (Using `return` is fine.) – SilentSteel Aug 15 '14 at 21:35
  • @SilentSteel I am seeing some weird behavior where not everything is getting appended to the file. This is at high volume, where I am trying to append every 1 ms. Do you know what I should be using instead? – gumenimeda Nov 19 '14 at 16:54
  • Worst case, you can use the synchronous version, `appendFileSync`. http://nodejs.org/api/fs.html#fs_fs_appendfilesync_filename_data_options But you may lose one of the great benefits of Node, which is async operations. Make sure you catch errors. Perhaps on some OSes you can get access denied if requesting the file handle at the same time; not sure about that. – SilentSteel Dec 01 '14 at 03:30
  • Also, this creates the file if it doesn't exist. The docs say: `Asynchronously append data to a file, creating the file if it does not yet exist. data can be a string or a buffer.` (http://nodejs.org/api/fs.html#fs_fs_appendfile_filename_data_options_callback) – Ben Feb 26 '15 at 04:12
  • Do not use `fs.appendFile`. It is a broken API which uses a new file descriptor for each call. On a busy server (hundreds of `appendFile` calls per second) this can result in all 1024 file descriptors being used up and the process being unable to open other files. – fadedbee Dec 14 '15 at 09:28
  • @chrisdew Thanks for the update... but if we are not to use the accepted answer here, what are we supposed to do? How did you solve this dilemma? – zipzit Mar 07 '16 at 10:33
  • Hi, how do I append each piece of data on a new line? – questionasker Feb 25 '17 at 06:20
  • Note that it won't automatically append a newline character, so if it's for something like a log file, a "\n" needs to be manually added at the end. – laurent Jun 23 '17 at 17:26
  • @zipzit You asked chrisdew (who I assume is now fadedbee) for an alternative to fs.appendFile. How about the answer from Plaute? It says fs.createWriteStream("append.txt", {flags:'a'}).write(myText); – Marcus Jan 05 '18 at 15:43
  • @fadedbee and @zipzit Old comment, but invaluable. I had a large loop with appends and noticed the thing would get exponentially slower. The best way is `createWriteStream` with `flags: 'a'` and subsequent `.write` calls for appending. I believe you can `await` each `.write` to make sure the appends are sequential. – noderman Jun 26 '20 at 19:44
  • @nelsonic To append to a file continuously you can use a write stream, but you need to open the file with the 'a' flag: `fs.createWriteStream('file', {flags: 'a'})`. Then you can write as you normally would with a write stream. – Nelson J Perez Apr 15 '23 at 21:44
397

When you want to write to a log file, i.e. append data to the end of a file, never use appendFile. appendFile opens a new file handle for each piece of data you add to the file; after a while you get a beautiful EMFILE error.

I can add that appendFile is not easier to use than a WriteStream.

Example with appendFile:

const fs = require('fs');

console.log(new Date().toISOString());
[...Array(10000)].forEach(function (item, index) {
    fs.appendFile("append.txt", index + "\n", function (err) {
        if (err) console.log(err);
    });
});
console.log(new Date().toISOString());

On my computer, roughly the first 8,000 appends succeed; after that you obtain this:

{ Error: EMFILE: too many open files, open 'C:\mypath\append.txt'
    at Error (native)
  errno: -4066,
  code: 'EMFILE',
  syscall: 'open',
  path: 'C:\\mypath\\append.txt' }

Moreover, appendFile writes whenever its file handle becomes available, so your log entries are not written in timestamp order. You can test this with the example: set 1,000 in place of 10,000 and the order will be random, depending on access to the file.

If you want to append to a file, you must use a writable stream like this:

var fs = require('fs');

var stream = fs.createWriteStream("append.txt", {flags: 'a'});
console.log(new Date().toISOString());
[...Array(10000)].forEach( function (item,index) {
    stream.write(index + "\n");
});
console.log(new Date().toISOString());
stream.end();

You end it when you want. You are not even required to call stream.end(): the default option is autoClose: true, so the file will be closed when your process ends and you avoid opening too many files.
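If you do prefer to close the stream explicitly on shutdown instead of relying on autoClose, a minimal sketch (the SIGINT handler is just an illustration) might look like this:

var fs = require('fs');
var stream = fs.createWriteStream("append.txt", {flags: 'a'});

process.on('SIGINT', function () {
  // Flush anything still buffered, then close the file descriptor before exiting.
  stream.end('interrupted\n', function () {
    process.exit(0);
  });
});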

davnicwil
Plaute
  • Thanks for the great answer, but my doubt is that, due to the asynchronous nature of JavaScript, it will execute `stream.end()` before the `stream.write()` calls, so we shouldn't use `stream.end()`; also, as you mentioned, `autoClose: true` is the default option, so why bother writing a line which is of no use? – Aashish Kumar May 07 '18 at 19:08
  • `due to asynchronous nature of Javascript`... What? Array.forEach is a synchronous operation. JS is synchronous; it just happens to provide some ways to manage asynchronous operations, like Promises and async/await. – Sharcoux Nov 28 '18 at 13:10
  • I'd guess `fs.appendFile` results in too many open files because you execute it in an asynchronous manner (you're asynchronously creating 10,000 file handles). I believe `appendFileSync` would not have this problem, nor would `fs.appendFile` with a proper interval (1 s is probably more than enough) or queueing. – apple apple May 22 '19 at 16:27
  • @appleapple But you're still opening the file each time. For a log, it makes much more sense to keep it open. – Radvylf Programs Oct 10 '19 at 14:04
  • @RedwolfPrograms For a busy server log, maybe true. For a once-per-execution log, maybe not. Anyway, I just state that the point (at least the reasoning) in this answer is not correct. – apple apple Oct 11 '19 at 07:44
  • For logs I would just use `fs.appendFileSync` (it can run a million times without issues). If the script crashes or exits right after you log something with `stream.write` or `fs.appendFile`, it is likely that you won't get the log written (which you will most likely need to debug the crash/exit). – Dan Oct 22 '19 at 18:55
159

Calling writeFile or appendFile opens a file descriptor for every write. A write stream created with createWriteStream reuses a single descriptor, and logStream.end() asks Node to close it immediately after the final write.

var fs = require('fs');
var logStream = fs.createWriteStream('log.txt', {flags: 'a'});
// use {flags: 'a'} to append and {flags: 'w'} to erase and write a new file
logStream.write('Initial line...');
logStream.end('this is the end line');
Software Engineer
Fabio Ferrari
  • Missing first line! It should be 'var fs = require('fs');' – Stormbytes Apr 09 '15 at 04:10
  • Or perhaps even better `var fs = require('graceful-fs')`, which works around some known problems. See the [docs](https://github.com/isaacs/node-graceful-fs) for more info. – Marko Bonaci May 27 '15 at 13:16
  • Both the initial and end line are on the same line though :-p – binki Dec 13 '16 at 06:25
  • **Please note**: If you are using `fs.createWriteStream`, then use `flags`. If you are using `fs.writeFile`, then it's `flag`. Please refer to [Node JS Docs - File System](https://nodejs.org/api/fs.html) for more information. – Anish Nair Aug 09 '17 at 06:39
  • Be careful! The parameter is not "flags" but "flag" (singular): https://nodejs.org/api/fs.html#fs_fs_writefile_file_data_options_callback – Benny Code Feb 24 '18 at 15:31
  • @BennyNeugebauer The use of `flags` here is correct. You've linked the docs for fs.writeFile, which does use 'flag', but this solution uses fs.createWriteStream, and its parameter is 'flags': https://nodejs.org/api/fs.html#fs_fs_createwritestream_path_options – Qwiso Apr 30 '19 at 12:34
41

Use the a+ flag to append to a file, creating it if it doesn't exist:

const fs = require('fs');

fs.writeFile('log.txt', 'Hello Node', { flag: "a+" }, (err) => {
  if (err) throw err;
  console.log('The file is created if it does not exist!');
});

Docs: https://nodejs.org/api/fs.html#fs_file_system_flags

t_dom93
37

Besides appendFile, you can also pass a flag to writeFile to append data to an existing file.

const fs = require('fs');

fs.writeFile('log.txt', 'Hello Node', { flag: 'a' }, function (err) {
    if (err) {
        return console.error(err);
    }
});

By passing the 'a' flag, the data is appended to the end of the file.

A J
  • **Please note**: If you are using `fs.createWriteStream`, then use `flags`. If you are using `fs.writeFile`, then it's `flag`. Please refer to [Node JS Docs - File System](https://nodejs.org/api/fs.html) for more information. – Anish Nair Aug 09 '17 at 06:42
27

You need to open the file, write to it, and then close it.

var fs = require('fs'),
    str = 'string to append to file';

fs.open('filepath', 'a', 0o666, function (e, id) {
  fs.write(id, str, null, 'utf8', function () {
    fs.close(id, function () {
      console.log('file closed');
    });
  });
});

Here are a few links that help explain the parameters:

open
write
close


EDIT: This answer is outdated; look at the newer fs.appendFile method for appending.

Corey Hart
17

My approach is rather special. I basically use the WriteStream solution, but without ever 'closing' the fd via stream.end(). Instead I use cork/uncork. This has the benefit of low RAM usage (if that matters to anyone), and I believe it's safer to use for logging/recording (my original use case).

Following is a pretty simple example. Note that I just added a pseudo loop for demonstration purposes; in production code I wait for websocket messages.

var fs = require('fs');

var stream = fs.createWriteStream("log.txt", {flags: 'a'});
while (true) { // pseudo loop for demonstration only
  stream.cork();
  stream.write("some content to log");
  process.nextTick(() => stream.uncork());
}

uncork will flush the data to the file in the next tick.

In my scenario there are peaks of up to ~200 writes per second of various sizes. During the night, however, only a handful of writes per minute are needed. The code works very reliably, even during peak times.
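For reference, the same cork/uncork pattern shaped as a per-message handler (the handler name is hypothetical; in the author's setup it would be driven by websocket messages) might look roughly like this:

const fs = require('fs');
const stream = fs.createWriteStream("log.txt", { flags: 'a' });

// Hypothetical handler: call it once per incoming message.
function logMessage(msg) {
  stream.cork();                             // hold this write in memory
  stream.write(msg + "\n");
  process.nextTick(() => stream.uncork());   // flush buffered data to the file on the next tick
}

logMessage('first message');
logMessage('second message');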

Guido
Tom-Oliver Heidel
16

fs.appendFile and fsPromises.appendFile are the fastest and most robust options when you need to append something to a file.

Contrary to what some other answers suggest, if a file path is supplied to the appendFile function, it closes the file by itself. Only when you pass in a file handle obtained from something like fs.open() do you have to take care of closing it yourself.

I tried it with over 50,000 lines in a file.

Examples:

(async () => {
  // using appendFile.
  const fsp = require('fs').promises;
  await fsp.appendFile(
    '/path/to/file', '\r\nHello world.'
  );

  // using apickfs; handles error and edge cases better.
  const apickFileStorage = require('apickfs');
  await apickFileStorage.writeLines(
    '/path/to/directory/', 'filename', 'Hello world.'
  );
})();


Ref: https://github.com/nodejs/node/issues/7560
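To illustrate the file handle case mentioned above (where closing is your responsibility), here is a minimal sketch using fsPromises.open; the path is a placeholder:

const fsp = require('fs').promises;

(async () => {
  // When you open the file yourself, you also have to close it yourself.
  const handle = await fsp.open('/path/to/file', 'a');
  try {
    await handle.appendFile('appended via a FileHandle\r\n');
  } finally {
    await handle.close();
  }
})();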

vivek agarwal
14

Node.js 0.8 has fs.appendFile:

const fs = require('fs');

fs.appendFile('message.txt', 'data to append', (err) => {
  if (err) throw err;
  console.log('The "data to append" was appended to file!');
});

Documentation

chbrown
7

If you want an easy and stress-free way to write logs line by line in a file, then I recommend fs-extra:

const os = require('os');
const fs = require('fs-extra');

const file = 'logfile.txt';
const options = {flag: 'a'};

async function writeToFile(text) {
  await fs.outputFile(file, `${text}${os.EOL}`, options);
}

writeToFile('First line');
writeToFile('Second line');
writeToFile('Third line');
writeToFile('Fourth line');
writeToFile('Fifth line');

Tested with Node v8.9.4.

Benny Code
5
const fs = require('fs');
const path = require('path');

const fd = fs.openSync(path.join(process.cwd(), 'log.txt'), 'a');
fs.writeSync(fd, 'contents to append');
fs.closeSync(fd);
Luis R.
  • Anything sync() is almost always a bad idea unless you're 100% sure you absolutely NEED it. Even then, you're probably doing it wrong. – Zane Claes Oct 14 '12 at 00:45
  • Doesn't mean it's wrong. It just does it synchronously. It might not be best practice for Node.js, but it's supported. – Luis R. Nov 17 '12 at 20:43
  • I was using "ur doin it wrong" in the colloquial internet-meme sense of the phrase. Obviously it's supported =P – Zane Claes Nov 17 '12 at 21:52
  • @LuisR. The whole point of using Node is to do things asynchronously, allowing processing to happen in the background while freeing the event loop to handle other requests (if it's a server) or other tasks. – Patrick Roberts Jul 15 '13 at 03:32
  • Agreed on async, but sometimes if you're just writing an interactive script, sync is fine. – bryanmac Mar 13 '14 at 12:55
  • Writing synchronously is absolutely OK if you are writing a single-user command line app (e.g. a script to do some stuff). It's faster to get things done that way. Why would Node have sync methods if not for this purpose? – Jan Święcki May 16 '14 at 13:06
  • `fdW = fsW.openSync("", './log.txt'), 'a')` gives `SyntaxError: Unexpected token )` – Gank Jul 14 '15 at 00:30
3

I offer this suggestion only because control over the open flags is sometimes useful; for example, you may want to truncate an existing file first and then append a series of writes to it, in which case you would use the 'w' flag when opening the file and not close it until all the writes are done (a sketch of that variant follows the code below). Of course, appendFile may be what you're after :-)

  var fs = require('fs');

  fs.open('log.txt', 'a', function(err, log) {
    if (err) throw err;
    fs.writeFile(log, 'Hello Node', function (err) {
      if (err) throw err;
      fs.close(log, function(err) {
        if (err) throw err;
        console.log('It\'s saved!');
      });
    });
  });
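And a minimal sketch of the truncate-then-append variant described above, using the 'w' flag and keeping the descriptor open across a series of writes (file name and contents are just examples):

var fs = require('fs');

// 'w' truncates the file on open; later writes on the same descriptor
// continue from the current position, so they effectively append.
fs.open('log.txt', 'w', function (err, fd) {
  if (err) throw err;
  fs.write(fd, 'first line\n', function (err) {
    if (err) throw err;
    fs.write(fd, 'second line\n', function (err) {
      if (err) throw err;
      fs.close(fd, function (err) {
        if (err) throw err;
        console.log('done');
      });
    });
  });
});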
balrob
3

Try using flags: 'a' to append data to a file:

var fs = require('fs');

var stream = fs.createWriteStream("udp-stream.log", {'flags': 'a'});
stream.once('open', function (fd) {
  // msg is the string you want to append
  stream.write(msg + "\r\n");
});
Codemaker2015
2

Using the jfile package:

// myFile = new JFile(path);
myFile.text += '\nThis is a new line to be appended';
Abdennour TOUMI
0

Here's a full script. Fill in your file names and run it and it should work! Here's a video tutorial on the logic behind the script.

var fs = require('fs');

function ReadAppend(file, appendFile){
  fs.readFile(appendFile, function (err, data) {
    if (err) throw err;
    console.log('File was read');

    fs.appendFile(file, data, function (err) {
      if (err) throw err;
      console.log('The "data to append" was appended to file!');

    });
  });
}
// edit this with your file names
var file = 'name_of_main_file.csv';
var appendFile = 'name_of_second_file_to_combine.csv';
ReadAppend(file, appendFile);
Jvieitez
0
const fs = require('fs');

const inovioLogger = (logger = "") => {
    const log_file = fs.createWriteStream(__dirname + `/../../inoviopay-${new Date().toISOString().slice(0, 10)}.log`, { flags: 'a' });
    const log_stdout = process.stdout;
    log_file.write(logger + '\n');
    log_stdout.write(logger + '\n'); // echo the line to stdout as well
};
eyllanesc
sunilsingh
0

In addition to denysonique's answer, sometimes you want the asynchronous appendFile (and other async fs methods) to return a promise instead of taking a callback. To do that, wrap the function with util.promisify, or import the promise-returning version from the fs.promises namespace:

const { appendFile } = require('fs').promises;

await appendFile('path/to/file/to/append', dataToAppend, optionalOptions);
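
The util.promisify route mentioned above would look roughly like this (a minimal sketch; the path and data are placeholders):

const { promisify } = require('util');
const fs = require('fs');

const appendFileAsync = promisify(fs.appendFile);

(async () => {
  await appendFileAsync('path/to/file/to/append', 'data to append\n');
})();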

I hope it helps.

Alex Gusev
0

I wrapped the async fs.appendFile in a Promise-based function. I hope it helps others see how this works.

const fs = require('fs');

function append(path, name, data) {
  return new Promise((resolve, reject) => {
    fs.appendFile(path + name, data, (err) => {
      if (err) {
        return reject(err);
      }
      return resolve(path + name);
    });
  });
}
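
Usage might look like this (path and contents are just examples):

append('/tmp/', 'log.txt', 'a new line\n')
  .then((fullPath) => console.log('appended to ' + fullPath))
  .catch((err) => console.error(err));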
Nick