
Is there a way to reduce memory usage on the client when converting a large JavaScript object to a string using JSON.stringify?

I'm looking for something that addresses the question below, but for JavaScript on the client.

Writing JSON to a stream without buffering the string in memory

When I try a simple JSON.stringify( big_object ) it quickly takes up all the RAM and freezes my computer.

The same memory-usage issue takes place when I try to write a large object to indexedDB, as described in detail here.

Example of memory leak in indexedDB at store.add (see Example at Edit)

These two questions from three years ago seem to describe the same problem, but I can't find any indication that a solution was ever found.

How can I make a really long string using IndexedDB without crashing the browser?

JSON.stringify optimization

The larger question is this: in an off-line web app in which the user can accumulate a large amount of data in an indexedDB database, the process to back that data up to the hard disk appears to be to write the data to an object, convert the object to a string, convert the string to a blob of text, and download the blob to disk. To upload the file and write it back to the database, perform the reverse. However, JSON.stringify and JSON.parse on the large object grab all the memory and crash the browser or the entire computer.
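
Concretely, the backup and restore steps look roughly like this; bigObject, uploadedFile, the file name, and the writeObjectToIndexedDB helper are only placeholders, and it is the JSON.stringify and JSON.parse calls in this flow that exhaust the RAM:

    // Backup: object -> JSON string -> Blob -> download link.
    var jsonText = JSON.stringify( bigObject ),
        blob = new Blob( [ jsonText ], { type: 'application/json' } ),
        a = document.createElement( 'a' );
    a.href = window.URL.createObjectURL( blob );
    a.download = 'backup.json';                   // placeholder file name
    a.click();

    // Restore: read the uploaded file back and parse it in one shot.
    var reader = new FileReader();
    reader.onload = function () {
      var restoredObject = JSON.parse( reader.result );
      writeObjectToIndexedDB( restoredObject );   // placeholder helper
    };
    reader.readAsText( uploadedFile );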

This link appears to state that the large blob issue in indexedDB has been resolved, but that doesn't appear to solve this problem, does it? The object can't be directly converted to a blob, can it? And, if so, can the organized object be recovered from a blob?

Erratic IndexedDB large file store operations cause IndexedDB to consume a large amount of memory that is not freed: https://bugzilla.mozilla.org/show_bug.cgi?id=1223782

Apart from having the user download and upload several files to back up and restore their work saved in the database, is there another way to accomplish this when it's all on the client and off-line?

Thank you for any direction you can provide.

Gary
  • Large objects (>1GB) are usually split in chunks and handled in streams, not in strings. What have you tried so far? – RaphaMex Jun 02 '18 at 05:43
  • To read large files, I use [FileReader](https://developer.mozilla.org/en-US/docs/Web/API/FileReader). What do you use to write files? – RaphaMex Jun 02 '18 at 06:06
  • To write files, I use a blob and an object URL: var blob = new Blob( [data], { type: 'text/csv' } ), where data is the result of JSON.stringify; and window.URL.createObjectURL(blob). – Gary Jun 03 '18 at 01:02
  • To answer your question, in regard to the impact on RAM of writing a large object to indexedDB, I simply had to write different components of the object to separate object stores and delete each component immediately after writing it. Trying to stringify before writing to the database just consumed RAM at a different point. I've tested the impact on RAM of stringifying various sizes of large objects that a user of this app would need to save their database to disk. I considered making the object more one-dimensional to reduce the serialization step's need for RAM, but decided to search here first. – Gary Jun 03 '18 at 01:41
  • Can the browser stream to the hard disk on the client off-line or is this streaming a server operation? Thank you. – Gary Jun 03 '18 at 01:44
  • Is there any reason why separate components of the object couldn't be stringified separately and concatenated with some type of key separator into one large string, converted to a blob, and written to disk? If the text can be read and parsed by key to grab the individual JSON strings, they could be parsed and written back to the object and database. The issue isn't object size but the RAM used to serialize. This appears to be the same concept as streaming, but on the client. I guess it depends on how difficult it is to read and parse the whole text file as a single string. I'll give it a try; roughly what I have in mind is sketched below. – Gary Jun 03 '18 at 02:21
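
A rough sketch of that idea, where the separator string and the writeComponentToStore helper are purely illustrative, and where passing the pieces to the Blob constructor as an array at least avoids concatenating them by hand:

    // Illustrative only: stringify each top-level component separately and
    // let the Blob constructor hold the pieces, assuming the separator
    // never occurs inside the JSON text.
    var SEPARATOR = '\n--COMPONENT--\n';   // placeholder separator

    function componentsToBlob( obj ) {
      var parts = [],
          keys = Object.keys( obj );
      for ( var i = 0; i < keys.length; i++ ) {
        if ( i > 0 ) parts.push( SEPARATOR );
        parts.push( JSON.stringify( { key: keys[ i ], value: obj[ keys[ i ] ] } ) );
      }
      return new Blob( parts, { type: 'text/plain' } );
    }

    // Restore: read the file back, split it, and parse one component at a time.
    function restoreComponents( file, writeComponentToStore ) {
      var reader = new FileReader();
      reader.onload = function () {
        var pieces = reader.result.split( SEPARATOR );
        for ( var i = 0; i < pieces.length; i++ ) {
          var component = JSON.parse( pieces[ i ] );
          writeComponentToStore( component.key, component.value );   // placeholder helper
        }
      };
      reader.readAsText( file );
    }

Note that FileReader still reads the whole backup file into one string here; only the serializing and parsing are done per component. If even that single string is too large, the file could be read in slices with file.slice(), at the cost of handling a separator that straddles two slices.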

1 Answer


"The same memory-usage issue takes place when I try to write a large object to indexedDB, as described in detail here."

IndexedDB has a limit of 5 MB on mobile / 50 MB on desktop, so if your object exceeds those device-based bounds, that is your issue with IndexedDB.

As to creating a massive string from a massive object, you may be running up against V8's string length limitation, which is currently 512 MB. So you will need to use stream-based parsing/serialization such as big-json.
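
For illustration, a minimal Node.js sketch of that stream-based approach, assuming big-json's createStringifyStream / createParseStream API as documented on npm; the file name is a placeholder:

    // Node.js sketch: stream the serialization so the full JSON text never
    // has to exist as a single in-memory string. Assumes `npm install big-json`.
    const fs = require( 'fs' );
    const json = require( 'big-json' );

    const bigObject = { /* ... data to back up ... */ };

    // Serialize: createStringifyStream emits the JSON text in chunks.
    json.createStringifyStream( { body: bigObject } )
        .pipe( fs.createWriteStream( 'backup.json' ) );

    // Deserialize: createParseStream reassembles the object from the file.
    const parseStream = json.createParseStream();
    parseStream.on( 'data', function ( pojo ) {
      // pojo is the reconstructed object
    } );
    fs.createReadStream( 'backup.json' ).pipe( parseStream );

Note that this targets Node streams rather than code running directly in the browser.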

HelloWorld
  • Thank you. I'm writing to a desktop and haven't exceeded the limit. I know because I can get a large object written to the database by deleting each component property immediately after it is written to an object store. However, during the transaction over 1GB of RAM is used to write an object a small fraction of that size. I will look into big-json. Thanks again. – Gary Jun 02 '18 at 04:01
  • No problem. If this works for you, please accept the answer so that future developers can know how to solve this problem – HelloWorld Jun 02 '18 at 04:03
  • I didn't know about big-json, and you provided useful information. However, it's not the answer to the question. It appears that, although big-json allows one to exceed the 512 MB limit, it doesn't reduce the amount of RAM it takes to convert an object to a string. The question is how the amount of RAM used to perform the conversion can be reduced in JavaScript, similar to what the first link accomplished in C#. So, if I accept big-json as the answer, it would likely be misleading to future developers. By the way, in Firefox, is it accurate that the max is 256 MB? I can't get near 512. – Gary Jun 02 '18 at 05:23
  • This is a very novice question, but does big-json become part of my app, or would every user of the app have to have big-json installed on their machine? I don't think a user of the app will ever hit 512 MB or even 256 MB, but I'd like to know anyway, if you don't mind. Thank you. – Gary Jun 02 '18 at 05:33
  • Add it as a dependency in your package.json. If this is a desktop app written in Node, then the user would need to install Node; you could add Node.js itself as a dependency in the desktop installer. If it's a web app, as long as the Node.js runtime is installed on your server, you are all good – HelloWorld Jun 06 '18 at 03:22