
I have an extremely large array of objects stored in a JSON file, which I would like to process and update. The array contains millions of objects. Normally, I would use JSON.parse() followed by something like

Promise.map(arr, someFunction)
.map(obj => anotherFunction(obj))
.then(arr => console.log('Done.'));

However, the JSON file cannot be loaded into memory all at once. So how can I reduce memory usage? I was thinking of processing it with Node.js streams, so I could handle it a piece at a time. I will also need to add or remove properties on some of the objects.

frozen

0 Answers