188

I need to pass in a text file in the terminal and then read the data from it, how can I do this?

node server.js file.txt

How do I pass in the path from the terminal, how do I read that on the other side?

Remi Guan
fancy
  • If you find yourself adding more options on the command line, you can use [Optimist](https://github.com/substack/node-optimist). – Jess Jan 17 '14 at 17:05
  • http://stackoverflow.com/questions/6156501/read-a-file-one-line-at-a-time-in-node-js?rq=1 shows another way to read a text file – Marc Durdin May 17 '17 at 06:13

6 Answers

218

You'll want to use the process.argv array to access the command-line arguments to get the filename and the FileSystem module (fs) to read the file. For example:

// Make sure we got a filename on the command line.
if (process.argv.length < 3) {
  console.log('Usage: node ' + process.argv[1] + ' FILENAME');
  process.exit(1);
}
// Read the file and print its contents.
var fs = require('fs')
  , filename = process.argv[2];
fs.readFile(filename, 'utf8', function(err, data) {
  if (err) throw err;
  console.log('OK: ' + filename);
  console.log(data)
});

To break that down a little: process.argv will always have at least two entries, the zeroth being the "node" interpreter and the first being the script that node is currently running; anything after that was passed on the command line. Once you've pulled a filename from argv, you can use the filesystem functions to read the file and do whatever you want with its contents. Sample usage would look like this:

$ node ./cat.js file.txt
OK: file.txt
This is file.txt!
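
To make the layout of process.argv concrete, here is a one-line check (the paths in the comment are only illustrative; they depend on your Node install and where the script lives):

// For `node ./cat.js file.txt` this prints something like:
// [ '/usr/local/bin/node', '/home/you/cat.js', 'file.txt' ]
console.log(process.argv);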

[Edit] As @wtfcoder mentions, using the "fs.readFile()" method might not be the best idea because it will buffer the entire contents of the file before yielding it to the callback function. This buffering could potentially use lots of memory but, more importantly, it does not take advantage of one of the core features of node.js - asynchronous, evented I/O.

The "node" way to process a large file (or any file, really) would be to use fs.read() and process each available chunk as it is available from the operating system. However, reading the file as such requires you to do your own (possibly) incremental parsing/processing of the file and some amount of buffering might be inevitable.

maerics
  • Awesome, thanks so much, very helpful. How could I split this data by line? – fancy Feb 06 '12 at 23:45
  • @fancy: try `var lines = data.split(/\r?\n/);`, then the array "lines" will have each line. – maerics Feb 06 '12 at 23:46
  • This isn't a good idea if the text file is large, as it will all be read into memory. If you process, say, a 1000 MB CSV file, look at fs.createReadStream; you will need to take care with line splitting though, as the data chunks won't (in most cases) fall on line boundaries (some people have already come up with solutions - google). – Mâtt Frëëman Feb 07 '12 at 03:45
  • @wtfcoder: yes, very good point. My intent was just to demonstrate the simple case of reading a file named on the command line; there are obviously many subtleties (esp. performance) that are beyond the scope of this question. – maerics Feb 07 '12 at 05:33
  • I posted a solution to a similar question for parsing a very large file, using a stream, synchronously. See: http://stackoverflow.com/questions/16010915/parsing-huge-logfiles-in-node-js-read-in-line-by-line/23695940#23695940 – Gerard May 16 '14 at 13:25
99

Using fs with Node:

var fs = require('fs');

try {  
    var data = fs.readFileSync('file.txt', 'utf8');
    console.log(data.toString());    
} catch(e) {
    console.log('Error:', e.stack);
}
Ronald Coarite
  • Note that this is the _synchronous_ version. – Rich Werden Dec 26 '18 at 16:55
  • @RichWerden what do you mean by "synchronous" in this context? – Json Apr 12 '19 at 16:04
  • In Node, when something is "synchronous" it stops/blocks the system from doing anything else. Say you have a Node web server: if any other requests come in while the above is happening, the server can't respond because it is busy reading the file. – Rich Werden Apr 12 '19 at 21:38
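
For contrast with the synchronous version above, here is a rough sketch of the non-blocking variant of the same read, using the asynchronous fs.readFile so the event loop stays free while the file is read:

var fs = require('fs');

// Asynchronous version: the callback runs once the file has been read,
// and the process can handle other work in the meantime.
fs.readFile('file.txt', 'utf8', function(err, data) {
    if (err) {
        console.log('Error:', err.stack);
        return;
    }
    console.log(data);
});
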
29

IMHO, fs.readFile() should be avoided because it loads the whole file into memory and won't call the callback until the entire file has been read.

The easiest way to read a text file is to read it line by line. I recommend a BufferedReader:

// Assumes the (now deprecated) "buffered-reader" npm package;
// newer versions expose DataReader instead (see the comments below).
var BufferedReader = require("buffered-reader");

new BufferedReader ("file", { encoding: "utf8" })
    .on ("error", function (error){
        console.log ("error: " + error);
    })
    .on ("line", function (line){
        console.log ("line: " + line);
    })
    .on ("end", function (){
        console.log ("EOF");
    })
    .read ();

For complex data structures like .properties or JSON files you need to use a parser (which internally should also use a buffered reader).
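
For instance, a small JSON config file can simply be read and parsed in one go (a minimal sketch; `config.json` is a made-up filename here):

var fs = require('fs');

// Read a small JSON file and parse it; fine for small config files,
// where buffering the whole file is not a concern.
fs.readFile('config.json', 'utf8', function (err, text) {
    if (err) throw err;
    var config = JSON.parse(text);
    console.log(config);
});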

Gabriel Llamas
  • Thanks for pointing out this technique. You're right that this may be the best way, but I just thought it was slightly confusing in the context of this question, which I think is asking about an undemanding use-case. As pointed out above, if it's just a small file being passed to a command line tool, there's no reason not to use `fs.readFile()` or `fs.readFileSync()`. It's got to be a huge file to cause a noticeable wait. A JSON config file like package.json is likely to be under 1 kb, so you can just `fs.readFile()` and `JSON.parse()` it. – John Starr Dewar Mar 18 '13 at 23:47
  • BufferedReader may have changed its signature. I had to replace BufferedReader with BufferedReader.DataReader, where BufferedReader was the module. See https://github.com/Gagle/Node-BufferedReader – bnieland Jan 28 '16 at 18:41
  • I see that BufferedReader is now deprecated. – Marc Rochkind Aug 22 '16 at 14:34
11

You can use a read stream and pipe to read the file line by line, without reading the whole file into memory at once.

var fs = require('fs'),
    es = require('event-stream');

// Take the file path from the command line, as in the question.
var filename = process.argv[2];

var s = fs.createReadStream(filename)
    .pipe(es.split())
    .pipe(es.mapSync(function(line) {
        // pause the read stream while this line is being processed
        s.pause();
        console.log("line:", line);
        s.resume();
    })
    .on('error', function(err) {
        console.log('Error:', err);
    })
    .on('end', function() {
        console.log('Finished reading.');
    })
);
LF00
  • 27,015
  • 29
  • 156
  • 295
8

I am posting a complete example that I finally got working. Here I am reading the file rooms/rooms.txt from the script rooms/rooms.js:

var fs = require('fs');
var path = require('path');

var readStream = fs.createReadStream(path.join(__dirname, '../rooms', 'rooms.txt'), 'utf8');
var data = '';
readStream.on('data', function(chunk) {
    data += chunk;
}).on('end', function() {
    console.log(data);
});
iamnotsam
-2

The async way of life:

#! /usr/bin/node

const fs = require('fs');

function readall (stream)
{
  return new Promise ((resolve, reject) => {
    const chunks = [];
    stream.on ('error', (error) => reject (error));
    stream.on ('data',  (chunk) => chunk && chunks.push (chunk));
    stream.on ('end',   ()      => resolve (Buffer.concat (chunks)));
  });
}

function readfile (filename)
{
  return readall (fs.createReadStream (filename));
}

(async () => {
  let content = await readfile ('/etc/ssh/moduli').catch ((e) => {})
  if (content)
    console.log ("size:", content.length,
                 "head:", content.slice (0, 46).toString ());
})();
ceving