I know there are many similar questions, but they all read from a file.

In my case I already have the object in memory (>500MB), fetched from a database.

When I use the native Node.js stringify

JSON.stringify(BigFatObject);

it throws an "Invalid string length" error, which I assume is because of the object's size.

My end goal is to convert the JavaScript object to a JSON string and later save it to AWS S3 storage. Time is not a big concern, but memory is.

I tried the JSONStream module, but its examples read from a file, not from an object that is already in memory.
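To make it concrete, something along these lines is what I'm trying to achieve (a rough sketch, not verified code; records, the bucket, and the key are placeholders, and I'm assuming the bulk of the object is one large array):

    const JSONStream = require('JSONStream');
    const { Readable } = require('stream');
    const AWS = require('aws-sdk');

    const s3 = new AWS.S3();

    // Turn the (assumed) big array into an object-mode stream and let
    // JSONStream emit the JSON text piece by piece instead of one huge string.
    const body = Readable.from(BigFatObject.records) // 'records' is a placeholder
      .pipe(JSONStream.stringify());                 // emits '[', items, separators, ']'

    // s3.upload() accepts a stream Body and performs a multipart upload.
    s3.upload(
      { Bucket: 'my-bucket', Key: 'big-object.json', Body: body },
      (err, data) => {
        if (err) console.error('Upload failed:', err);
        else console.log('Uploaded to', data.Location);
      }
    );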

kmanish75
  • This probably depends on how much your CPU and RAM can handle. 500MB was likely already near the limit, and the JSON.stringify call needed even more memory on top of that, which pushed it over. See https://stackoverflow.com/questions/29175877/json-stringify-throws-rangeerror-invalid-string-length-for-huge-objects. – code Mar 21 '22 at 05:30
  • Did you try this? https://www.npmjs.com/package/big-json (see the first sketch after these comments) – dangerousmanleesanghyeon Mar 21 '22 at 05:31
  • I'm going to assume the way it gets this big is that there's a big array in there somewhere. Perhaps break the array into pieces and stringify and store the pieces separately (see the second sketch below). Other than that, large structured data probably belongs in a different type of storage than one giant JSON string. Even if you do succeed in storing it, it will be hard to read, parse, and use that way too. – jfriend00 Mar 21 '22 at 05:44
  • I suggest you read these comments and update your question with more information. Try to explain where you get the data from and what its structure is… – adir abargil Mar 21 '22 at 05:51
  • For giant arrays of data, .csv is much easier to deal with because it can be streamed to and from disk one line at a time (see the last sketch below). It's also easier to append new data to. – jfriend00 Mar 21 '22 at 06:03
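Following up on the big-json comment above: the package exposes createStringifyStream(), which takes the already-in-memory object directly, so no file is involved. A minimal sketch (bucket and key are placeholders):

    const bigJson = require('big-json');
    const AWS = require('aws-sdk');

    const s3 = new AWS.S3();

    // createStringifyStream emits the serialized JSON in chunks rather than
    // building one giant string, so the string-length limit is never hit.
    const stringifyStream = bigJson.createStringifyStream({ body: BigFatObject });

    s3.upload(
      { Bucket: 'my-bucket', Key: 'big-object.json', Body: stringifyStream },
      (err) => {
        if (err) console.error('Upload failed:', err);
        else console.log('Upload complete');
      }
    );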
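And a sketch of the split-it-into-pieces idea from the comments: if the bulk of the object is one big array, stringify and upload slices of it so no single string ever approaches the limit (the chunk size and key layout here are arbitrary choices):

    const AWS = require('aws-sdk');
    const s3 = new AWS.S3();

    const CHUNK_SIZE = 10000; // records per piece; tune to your data

    async function uploadInPieces(records) {
      for (let i = 0; i < records.length; i += CHUNK_SIZE) {
        const piece = records.slice(i, i + CHUNK_SIZE);
        // Each piece is small enough for a plain JSON.stringify.
        await s3.upload({
          Bucket: 'my-bucket',
          Key: `big-object/part-${i / CHUNK_SIZE}.json`,
          Body: JSON.stringify(piece),
        }).promise();
      }
    }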
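Finally, the CSV route from the last comment is just as streamable from memory. A naive sketch assuming flat records (no quoting or escaping handled; a real version should use a CSV library such as csv-stringify):

    const { Readable } = require('stream');

    // Lazily serialize records to CSV, one line at a time.
    function toCsvStream(records, columns) {
      function* lines() {
        yield columns.join(',') + '\n';                      // header row
        for (const rec of records) {
          yield columns.map((c) => rec[c]).join(',') + '\n'; // data rows
        }
      }
      return Readable.from(lines());
    }

    // Pipe the result to a file, or to s3.upload() as in the sketches above:
    // toCsvStream(BigFatObject.records, ['id', 'name']).pipe(process.stdout);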

0 Answers