
As the title implies, I'm trying to stringify a huge JavaScript object with JSON.stringify in my Node.js app. The objects are huge (tens of megabytes) and don't contain any functions. I need to write the serialized objects to a file. What I'm getting now is this:

RangeError: Invalid string length
  at Object.stringify (native)
  at stringifyResult (/my/file.js:123:45) -> line where I use JSON.stringify

Any idea how to solve that issue?

borisdiakur
  • That said, if what you're doing is preparing the data structure for output, you could write your own JSON serializer that incrementally writes to an output stream instead of creating a single massive string. It wouldn't be super-easy, but it wouldn't be super-hard either (a sketch of this idea follows these comments). – Pointy Mar 20 '15 at 21:17
  • I think there are streaming or buffered JSON de/serializers out there. –  Mar 20 '15 at 21:31
  • Looking for a similar answer, except for javascript client side (no node). Meanwhile, here's an answer to your problem, @boris: http://stackoverflow.com/questions/24153996/is-there-a-limit-on-the-size-of-a-string-in-json-with-node-js – cregox Nov 30 '15 at 09:26
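
A minimal sketch of the incremental serializer Pointy describes, assuming the data is an array of individually small records (the function name and file path here are hypothetical):

// Stream a JSON array to disk one element at a time, so no single
// giant string is ever built in memory.
const fs = require('fs');

function writeJsonArray(filePath, records, done) {
  const out = fs.createWriteStream(filePath);
  out.write('[');
  records.forEach((record, i) => {
    // only each individual element is stringified here
    out.write((i > 0 ? ',' : '') + JSON.stringify(record));
  });
  out.write(']');
  out.end(done); // a production version should also honor backpressure ('drain')
}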

4 Answers


I too have seen this unhelpful/misleading Node.js error message, so I filed an issue over at the Node.js GitHub repo:

RangeError: Invalid string length --- it should be saying Out Of Memory
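
For what it's worth, the error is V8 hitting its hard cap on string length (somewhere between roughly 2^28 and 2^30 characters, depending on the V8 version), not the process literally exhausting heap memory. A quick way to reproduce it without JSON.stringify:

// should throw "RangeError: Invalid string length" on a 64-bit Node.js build,
// since 2**30 characters exceeds V8's maximum string length
'x'.repeat(2 ** 30);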

Scott Stensland

As mentioned by @sandeepanu, there's a great little solution by @madhunimmo if you're trying to stringify a huge array: just stringify one element at a time:

let out = "[" + yourArray.map(el => JSON.stringify(el)).join(",") + "]";

If you're trying to stringify an object with a very large number of keys/properties, you can use Object.entries() to turn it into an array of key/value pairs first:

let out = "[" + Object.entries(yourObject).map(el => JSON.stringify(el)).join(",") + "]";

If that still doesn't work, you'll probably want a streaming approach, although you could also slice your array into portions and store them as multiple JSONL (one JSON object per line) files:

// untested code
let numFiles = 4;
for (let i = 0; i < numFiles; i++) {
  // join with newlines so each file is valid JSONL (one JSON object per line)
  let out = arr.slice((i / numFiles) * arr.length, ((i + 1) / numFiles) * arr.length).map(el => JSON.stringify(el)).join("\n");
  // add your code to store/save `out` here
}
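
In Node.js, storing each chunk could look like this (the file names are hypothetical); fs.writeFileSync keeps the chunks in order without juggling callbacks:

// untested sketch: write each slice to its own JSONL file
const fs = require('fs');

let numFiles = 4;
for (let i = 0; i < numFiles; i++) {
  let out = arr
    .slice((i / numFiles) * arr.length, ((i + 1) / numFiles) * arr.length)
    .map(el => JSON.stringify(el))
    .join("\n");
  fs.writeFileSync(`chunk-${i}.jsonl`, out);
}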

One streaming approach (new, and currently only supported in Chrome, but will likely come to other browsers, and even Deno and Node.js in some form or another) is to use the File System Access API. The code would look something like this:

// untested code
const dirHandle = await window.showDirectoryPicker();
const fileHandle = await dirHandle.getFileHandle('yourData.jsonl', { create: true });
const writable = await fileHandle.createWritable();
for(let el of yourArray) {
  await writable.write(JSON.stringify(el)+"\n");
}
await writable.close();
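
In Node.js today, the same streaming write works with fs.createWriteStream; a sketch along the lines of the code above (the file name is hypothetical):

const fs = require('fs');

// one JSON object per line (JSONL), never materializing one giant string
const writable = fs.createWriteStream('yourData.jsonl');
for (const el of yourArray) {
  writable.write(JSON.stringify(el) + "\n");
}
writable.end();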
joe

I find JSONStream to be a reliable alternative to the native JSON.stringify that works well with large objects. For example:

const fileSystem = require( "fs" );
const JSONStream = require( "JSONStream" );

const records = [
    { id: 1, name: "Terminator" },
    { id: 2, name: "Predator" },
    { id: 3, name: "True Lies" },
    { id: 4, name: "Running Man" },
    { id: 5, name: "Twins" }
    // .... hundreds of thousands of records ....
];

// JSONStream.stringify() emits one JSON array, one element per write()
const transformStream = JSONStream.stringify();
const outputStream = fileSystem.createWriteStream( __dirname + "/data.json" );

transformStream.pipe( outputStream );
records.forEach( transformStream.write );
transformStream.end();

outputStream.on(
    "finish",
    function handleFinish() {
        console.log( "Done" );
    }
);

Took the sample code from here.
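
JSONStream can also read such a file back without parsing one giant string; a sketch using its pattern syntax, where "*" emits each array element as a separate data event:

const fs = require( "fs" );
const JSONStream = require( "JSONStream" );

// emits one record at a time instead of one huge parsed object
fs.createReadStream( __dirname + "/data.json" )
    .pipe( JSONStream.parse( "*" ) )
    .on( "data", function handleRecord( record ) {
        console.log( record.name );
    });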

Usman Khawaja
  • This approach helped me, thanks. Just one question: could this answer be tweaked to produce single-line JSON? Because at this point there's a line break for every JSON object. – Rodolfo Velasco Apr 04 '23 at 19:04

Here's a simple helper file, using the big-json package, that can do the job:

const fs = require('fs');
const json = require('big-json');
 
// pojo will be streamed out in JSON chunks written to the specified file name
function makeFile(filename, pojo){

    const stringifyStream = json.createStringifyStream({
        body: pojo
    });

    // pipe the chunks straight into a write stream; firing off a separate
    // fs.appendFile per chunk risks the chunks landing out of order
    stringifyStream.pipe(fs.createWriteStream(filename));

}

module.exports = {
    makeFile
}
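
Usage could then look like this, assuming the helper above is saved as makeFile.js (the file name and object are placeholders):

const { makeFile } = require('./makeFile');

// streams the serialized object to disk chunk by chunk
makeFile('huge.json', { /* your very large object */ });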
BuffaloDev