7

I have a large JavaScript object that I want to convert to JSON and write to a file. I thought I could do this using streams, like so:

  var fs = require('fs');
  var JSONStream = require('JSONStream');
  var st = JSONStream.stringifyObject()
             .pipe(fs.createWriteStream('./output_file.js'))

  st.write(large_object);

When I try this, I get an error:

stream.js:94
  throw er; // Unhandled stream error in pipe.
        ^
TypeError: Invalid non-string/buffer chunk
    at validChunk (_stream_writable.js:153:14)
    at WriteStream.Writable.write (_stream_writable.js:182:12)

So apparently I can't just write an object to stringifyObject. I'm not sure what the next step is. Do I need to convert the object to a buffer? Run the object through some conversion stream and pipe it to stringifyObject?

– kevzettler

2 Answers

3

JSONStream doesn't work that way, but since your large object is already loaded into memory, there is no point in streaming it anyway.

var fs = require('fs-extra');
var file = '/tmp/this/path/does/not/exist/file.txt';

// outputJson stringifies the object and writes it to the file,
// creating any missing parent directories along the way.
fs.outputJson(file, {name: 'JP'}, function (err) {
  console.log(err); // => null
});

That will write the JSON.
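If you would rather avoid the fs-extra dependency, a plain-fs equivalent is a minimal sketch, assuming the target directory already exists and large_object is the object from the question:

var fs = require('fs');

// JSON.stringify builds the whole string in memory first, which, as the
// comments below note, can itself exhaust memory for very large objects.
fs.writeFile('./output_file.json', JSON.stringify(large_object), function (err) {
  if (err) console.error(err);
});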

If you want to use JSONStream you could do something like this:

var fs = require('fs');
var jsonStream = require('JSONStream');

var fl = fs.createWriteStream('dat.json');

// stringifyObject() emits a JSON object literal; each write takes a
// [key, value] pair.
var out = jsonStream.stringifyObject();
out.pipe(fl);

var obj = { test: 10, ok: true };
for (var key in obj) out.write([key, obj[key]]);
out.end();
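If the top-level value is an array rather than an object (the case mentioned in the comments below), JSONStream.stringify() works the same way, one element per write; a minimal sketch:

var fs = require('fs');
var jsonStream = require('JSONStream');

// stringify() emits a JSON array; each write appends one element.
var outArr = jsonStream.stringify();
outArr.pipe(fs.createWriteStream('arr.json'));

[1, 2, 3].forEach(function (n) { outArr.write(n); });
outArr.end();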
– Jason Livesay
  • 3
    Your first suggestion leads to `FATAL ERROR: JS Allocation failed - process out of memory` – kevzettler Sep 06 '15 at 22:32
  • I just changed the second one to be exact code for your situation unless large is an array. Try that. – Jason Livesay Sep 06 '15 at 22:40
  • I tried the second version; however, I have a large nested object as one of the `obj[key]` values, and that's what's throwing the memory allocation error. I'd need something similar that is recursive for child objects (see the sketch after these comments). – kevzettler Sep 06 '15 at 22:50
  • 1
    @kevzettler May I ask how did you solve it with recursion? – adrai Oct 19 '17 at 15:33
  • 1
    I've created a gist that streams the json with a TransformStream: https://gist.github.com/adrai/713b298fd83da0063910aa9f1674a5ed – adrai Oct 20 '17 at 06:50
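A minimal sketch of the recursive idea raised in these comments (a hypothetical writeJson helper, not adrai's gist): walk the object and write small string chunks to the stream instead of building one giant JSON string in memory. It handles objects, arrays and primitives, but ignores stream backpressure, cycles, and undefined values.

var fs = require('fs');

// Recursively serialize a value, writing small chunks to the stream
// rather than building the whole JSON string in memory first.
function writeJson(stream, value) {
  if (Array.isArray(value)) {
    stream.write('[');
    value.forEach(function (item, i) {
      if (i > 0) stream.write(',');
      writeJson(stream, item);
    });
    stream.write(']');
  } else if (value !== null && typeof value === 'object') {
    stream.write('{');
    Object.keys(value).forEach(function (key, i) {
      if (i > 0) stream.write(',');
      stream.write(JSON.stringify(key) + ':');
      writeJson(stream, value[key]);
    });
    stream.write('}');
  } else {
    stream.write(JSON.stringify(value)); // primitives: numbers, strings, booleans, null
  }
}

var out = fs.createWriteStream('dat.json');
writeJson(out, { test: 10, nested: { a: [1, 2, 3], ok: true } });
out.end();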
1

The question is quite old but still relevant today. I faced the same issue and solved it using the JsonStreamStringify package.

const { JsonStreamStringify } = require("json-stream-stringify");

Now,

// JsonStreamStringify is a readable stream; pipe it straight to a
// writable stream such as an HTTP response.
new JsonStreamStringify(cursor).pipe(res);

Here you can read your file using fs and then apply the code above; 'cursor' points to the data you read.

This way, you can stream your data as valid JSON.
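For the original question's case (a large in-memory object written to a file), a minimal sketch with this package might look like this; large_object and the output filename are assumptions:

const fs = require("fs");
const { JsonStreamStringify } = require("json-stream-stringify");

// JsonStreamStringify emits the serialized JSON in chunks, so the
// full string never has to fit in memory at once.
new JsonStreamStringify(large_object)
  .pipe(fs.createWriteStream("./output_file.json"))
  .on("finish", function () { console.log("done"); });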

Docs: https://www.npmjs.com/package/json-stream-stringify