
I've searched all over Stack Overflow and Google for this, but can't seem to figure it out.

I'm scraping social media links of a given URL page, and the function returns an object with a list of URLs.

When I try to write this data to a different file, it comes out as `[object Object]` instead of the expected `[ 'https://twitter.com/#!/101Cookbooks', 'http://www.facebook.com/101cookbooks' ]`, which is what I see when I `console.log()` the results.

This is my sad attempt to read and write a file in Node, reading each line (the URL) and passing it into a function call, `request(line, gotHTML)`:

var fs = require('fs');
var request = require('request'); // assumed: the request HTTP client, given the request(line, gotHTML) call

fs.readFileSync('./urls.txt').toString().split('\n').forEach(function (line){
    console.log(line); 
    var obj = request(line, gotHTML); 
    console.log(obj); 
    fs.writeFileSync('./data.json', obj , 'utf-8'); 
});   

for reference -- the gotHTML function:

function gotHTML(err, resp, html){ 
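    // Note: $ and socialurls aren't defined in this snippet; from the calls below,
    // $ is presumably cheerio and socialurls an array of RegExp patterns for social-media domains.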
    var social_ids = []; 

    if(err){
        return console.log(err); 
    } else if (resp.statusCode === 200){ 
        var parsedHTML = $.load(html); 

    parsedHTML('a').map(function(i, link){
        var href = $(link).attr('href');
        for(var j = 0; j < socialurls.length; j++){
            if(socialurls[j].test(href) && social_ids.indexOf(href) < 0){
                social_ids.push(href);
            }
        }
    });
    }

    return social_ids;
};

– sarahbkim

    `[object Object]` is an object's default `toString` output. If you want a representation of the object, use `JSON.stringify`. – elclanrs Feb 24 '14 at 00:12
    Be careful with `JSON.stringify`. With arrays, you're safe, but when objects have circular references it will fail ([see this topic](http://stackoverflow.com/a/11616993/151445)). The `util` module handles circular references. – Joseph Yaduvanshi Feb 24 '14 at 15:02
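
To illustrate the circular-reference caveat from the comment above, here is a minimal sketch (the object is invented purely for illustration):

var util = require('util');

// an invented object with a circular reference
var node = { name: 'a' };
node.self = node;

try {
    JSON.stringify(node); // throws: TypeError: Converting circular structure to JSON
} catch (e) {
    console.log(e.message);
}

// util.inspect copes with the cycle, printing something like { name: 'a', self: [Circular] }
console.log(util.inspect(node));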

7 Answers


Building on what deb2fast said, I would also pass in a couple of extra parameters to JSON.stringify() to get it to pretty-format the output:

fs.writeFileSync('./data.json', JSON.stringify(obj, null, 2) , 'utf-8');

The second param is an optional replacer function, which you don't need in this case, so null works.

The third param is the number of spaces to use for indentation. 2 and 4 seem to be popular choices.
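
For illustration, here is a minimal sketch of what those two parameters do, assuming an invented user object and output path:

var fs = require('fs');

var user = { name: 'Ada', password: 'secret', age: 36 };

// Replacer: drop the password field from the serialized output
function replacer(key, value) {
    return key === 'password' ? undefined : value;
}

// Pretty-printed with 2-space indentation, password omitted
fs.writeFileSync('./user.json', JSON.stringify(user, replacer, 2), 'utf-8');
// ./user.json now contains:
// {
//   "name": "Ada",
//   "age": 36
// }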

– Guy

obj is an array in your example.

fs.writeFileSync(filename, data, [options]) requires either a String or a Buffer for the data parameter (see the docs).

Try to write the array in a string format:

// writes: https://twitter.com/#!/101Cookbooks,http://www.facebook.com/101cookbooks
fs.writeFileSync('./data.json', obj.join(','), 'utf-8');

Or:

// writes [ 'https://twitter.com/#!/101Cookbooks', 'http://www.facebook.com/101cookbooks' ]
var util = require('util');
fs.writeFileSync('./data.json', util.inspect(obj), 'utf-8');

Edit: the reason you see the array in your example is that Node's implementation of console.log doesn't just call toString; it calls util.format (see the console.js source).
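
A small sketch of that difference, using an invented array that mirrors the question:

var util = require('util');

var obj = ['https://twitter.com/#!/101Cookbooks', 'http://www.facebook.com/101cookbooks'];

// toString() is what you effectively get when the value is coerced to a string
console.log(obj.toString());
// https://twitter.com/#!/101Cookbooks,http://www.facebook.com/101cookbooks

// console.log itself goes through util.format, which inspects the value
console.log(util.format(obj));
// [ 'https://twitter.com/#!/101Cookbooks', 'http://www.facebook.com/101cookbooks' ]
// (exact line wrapping depends on the Node version)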

– Joseph Yaduvanshi
    If your object contains large arrays, this will not work as the inspect method will output something like "45000 more...]" – Coxer Feb 16 '17 at 16:26
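
If you do need the full array from util.inspect, it accepts an options object with maxArrayLength (available in Node 6.1+, as far as I know); a quick sketch with invented data:

var util = require('util');

var big = new Array(45100).fill(0);

// default inspect truncates long arrays with something like "... more items"
console.log(util.inspect(big).length);

// maxArrayLength: null disables that truncation
console.log(util.inspect(big, { maxArrayLength: null }).length);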

If you're getting `[object Object]`, then use JSON.stringify:

fs.writeFile('./data.json', JSON.stringify(obj), 'utf-8', function (err) { if (err) throw err; });

It worked for me.

– deb2fast

In my experience, JSON.stringify is slightly faster than util.inspect. I had to save the result object of a DB2 query as a JSON file; the query returned an object of 92k rows, and the conversion took very long to complete with util.inspect, so I ran the following test, writing the same 1000-record object to a file with both methods.

  1. JSON.stringify

    fs.writeFile('./data.json', JSON.stringify(obj, null, 2), function (err) { if (err) throw err; });
    

Time: 3:57 (3 min 57 sec)

Result format:

[
  {
    "PROB": "00001",
    "BO": "AXZ",
    "CNTRY": "649"
   },
  ...
]
  2. util.inspect

    var util = require('util');
    fs.writeFile('./data.json', util.inspect(obj, false, 2, false), function (err) { if (err) throw err; });
    

Time: 4:12 (4 min 12 sec)

Result format:

[ { PROB: '00001',
    BO: 'AXZ',
    CNTRY: '649' },
    ...
]
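
For anyone who wants to reproduce a comparison like this, here is a rough sketch using synchronous writes and console.time; the 1000-record sample is invented, and real timings will depend on your data, disk and Node version:

var fs = require('fs');
var util = require('util');

// invented sample: 1000 records shaped like the rows above
var rows = [];
for (var i = 0; i < 1000; i++) {
    rows.push({ PROB: String(i), BO: 'AXZ', CNTRY: '649' });
}

console.time('JSON.stringify');
fs.writeFileSync('./data-stringify.json', JSON.stringify(rows, null, 2));
console.timeEnd('JSON.stringify');

console.time('util.inspect');
fs.writeFileSync('./data-inspect.txt', util.inspect(rows, false, 2, false));
console.timeEnd('util.inspect');
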
– Netsmile

Could you try JSON.stringify(obj)?

Like this:

var stringify = JSON.stringify(obj);
fs.writeFileSync('./data.json', stringify, 'utf-8'); 
– bdkopen

Just in case anyone else stumbles across this: I use the fs-extra library in Node and write JavaScript objects to a file like this:

const fse = require('fs-extra');
fse.outputJsonSync('path/to/output/file.json', objectToWriteToFile); 
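
If memory serves, fs-extra also offers an asynchronous, promise-returning variant and accepts a spaces option for pretty-printing; worth double-checking against the fs-extra docs for your version. A sketch with an invented object and path:

const fse = require('fs-extra');

const objectToWriteToFile = { name: 'Ada', links: ['https://example.com'] };

// async variant, pretty-printed with 2-space indentation
fse.outputJson('path/to/output/file.json', objectToWriteToFile, { spaces: 2 })
    .then(() => console.log('written'))
    .catch(err => console.error(err));
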
– tljw86

Further to @Jim Schubert's and @deb2fast's answers:

To be able to write out large objects (on the order of ~100 MB and larger), you'll need to use for...of as shown below and adapt it to your requirements.

const fsPromises = require('fs').promises;

const sampleData = {firstName:"John", lastName:"Doe", age:50, eyeColor:"blue"};

const writeToFile = async () => {
  for (const dataObject of Object.keys(sampleData)) {
    console.log(sampleData[dataObject]);
    await fsPromises.appendFile("out.json", dataObject + ": " + JSON.stringify(sampleData[dataObject]));
  }
}

writeToFile();

Refer to https://stackoverflow.com/a/67699911/3152654 for a full reference on Node.js limits.
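
An alternative for very large arrays, sketched here with invented data, is to stream the JSON out element by element so that no single JSON.stringify call has to hold the whole structure:

const fs = require('fs');

// invented sample: a large array of records
const records = Array.from({ length: 100000 }, (_, i) => ({ id: i, value: 'row ' + i }));

const out = fs.createWriteStream('./big-out.json');
out.write('[\n');
records.forEach((record, i) => {
    // stringify one element at a time, comma-separated
    out.write((i ? ',\n' : '') + JSON.stringify(record));
});
out.write('\n]\n');
out.end();

For truly huge data you'd also want to respect stream backpressure (the return value of write() and the 'drain' event), which this sketch ignores for brevity.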

– imsheth