Is there a way to reduce memory usage on the client when converting a large JavaScript object to a string using JSON.stringify?
I'm looking for something that addresses the question below, but for JavaScript on the client:
Writing JSON to a stream without buffering the string in memory
When I try a simple JSON.stringify(big_object), it quickly consumes all available RAM and freezes my computer.
The same memory-usage issue occurs when I try to write a large object to IndexedDB, as described in detail here:
Example of memory leak in indexedDB at store.add (see Example at Edit)
These two questions from three years ago seem to describe the same problem, but I can't find that either was ever resolved:
How can I make a really long string using IndexedDB without crashing the browser?
The larger question is this: in an offline web app in which the user can accumulate a large amount of data in an IndexedDB database, the way to back that data up to disk appears to be to write the data to an object, convert the object to a string, the string to a Blob of text, and download the Blob to disk; to restore, upload the file and reverse the process. However, JSON.stringify and JSON.parse on the large object grab all the memory and crash the browser, or even the entire computer.
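One idea I've been experimenting with (a sketch, not a confirmed fix): since the Blob constructor accepts an array of parts, the whole object never has to exist as one giant string. Assuming the data can be broken into independent records (a hypothetical shape for my data, not a general rule), each record can be stringified on its own and the small strings handed to the Blob, one JSON object per line:

```javascript
// Sketch: back up an array of records without building one huge string.
// Assumes the data divides into independent records; only one record's
// JSON is materialized at a time, and the Blob holds the small parts.
function recordsToNdjsonParts(records) {
  const parts = [];
  for (const record of records) {
    parts.push(JSON.stringify(record) + "\n"); // one small string per record
  }
  return parts;
}

function backupToBlob(records) {
  // The Blob constructor takes an array of parts directly; no single
  // concatenated string is ever created in this function.
  return new Blob(recordsToNdjsonParts(records), {
    type: "application/x-ndjson",
  });
}
```

In the browser the Blob could then be handed to URL.createObjectURL and a temporary download link; that part is standard, but I haven't verified that it avoids the memory spike for very large stores.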
This link appears to state that the large-blob issue in IndexedDB has been resolved, but that doesn't appear to solve this problem, does it? The object can't be converted directly to a Blob, can it? And if it can, can the structured object be recovered from the Blob?
Erratic IndexedDB large file store operations cause IndexedDB consume a large amount of memory that is not freed. https://bugzilla.mozilla.org/show_bug.cgi?id=1223782
Apart from having the user download and upload several files to back up and restore the work saved in the database, is there another way to accomplish this when everything is on the client and offline?
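For the restore direction, the best I've come up with is this sketch, which assumes the backup file holds one JSON object per line (an assumption about the file format, not a given): each line is parsed on its own, so JSON.parse only ever sees one record's worth of string rather than the whole file.

```javascript
// Sketch: restore records one at a time from newline-delimited JSON text.
// Assumes one JSON object per line; each JSON.parse call is small, so the
// large single-string parse that crashes the browser is avoided.
function* parseNdjson(text) {
  let start = 0;
  while (start < text.length) {
    let end = text.indexOf("\n", start);
    if (end === -1) end = text.length;
    const line = text.slice(start, end);
    if (line.trim() !== "") {
      yield JSON.parse(line); // one small parse per record
    }
    start = end + 1;
  }
}
```

In the browser, each yielded record could be added back to the store inside an IndexedDB transaction. This still holds the full file text in memory after reading it; a truly streaming read (file.stream() with a TextDecoder) might avoid even that, but I haven't tested whether it helps.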
Thank you for any direction you can provide.