
I'm working on a NodeJS app, and I need to create a JSON file out of a MongoDB collection. I need to perform some logic on each object, so I'm streaming a Mongo cursor through a transform stream and then piping it into a write stream for the JSON file.

However, the resulting file turns out to be a bunch of individual objects written back to back, which is not valid JSON. I would like to wrap all the documents in a single JSON object, but I can't figure out how.

I am able to get the desired result by querying the collection with `.toArray()`, performing the logic on the whole array, and then using `Object.assign({}, array)`, but as you can imagine this takes forever and doesn't take advantage of streams at all.

Dan-DH
  • `mongoexport` can get the collection data as JSON - it is an efficient tool. Maybe you can consider doing aggregate operations on the query data and then writing the result to the JSON. Reading data from a cursor and performing data transformation in the client program is not efficient. – prasad_ Apr 20 '22 at 08:59
  • Indeed, but unfortunately some of the fields I need to add to the objects are not in the collection itself. Plus I need the file to be able to be downloaded via URL. – Dan-DH Apr 20 '22 at 09:05
  • Also, cursor is not data - it is only a reference to data. You cannot directly write a cursor to a file. You use cursor methods to access the actual data. – prasad_ Apr 20 '22 at 09:07
  • You mean like `.stream()`? I'm using this: `this.db.collection(collection).find().stream().pipe(transformStream)` to get the data into the file. – Dan-DH Apr 20 '22 at 09:11
  • The `collection.find()` method returns a Cursor object (this is with native and NodeJS driver code). See the StackOverflow post [What is a cursor in MongoDB?](https://stackoverflow.com/questions/36766956/what-is-a-cursor-in-mongodb/68212051#68212051). – prasad_ Apr 20 '22 at 09:13
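For context on the `mongoexport` suggestion above: it can write an entire collection as a single JSON array with the `--jsonArray` flag, without any client-side code. A sketch of the invocation, with hypothetical connection string, collection, and output names:

```shell
# Export the whole collection as one JSON array ("--jsonArray")
# rather than one JSON document per line (the default).
mongoexport --uri="mongodb://localhost:27017/mydb" \
  --collection=mycoll --out=mycoll.json --jsonArray
```

As the question's author notes, though, this only covers fields that already exist in the collection, so it doesn't replace the per-document transform logic.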

0 Answers