I am looking for an efficient way to copy a number of documents from one collection to another while adding an attribute. Basically, I have a "spool space" collection where documents pile up; at a given time, I have to add a special metadata node to each one and push them into the documents collection.
Limitations and environment:
- we are using the java-driver
- we do not know the document structure
- the MongoDB version is 3.2
Right now, we are doing it document-wise like this:
```java
// eq() is statically imported from com.mongodb.client.model.Filters
spool.find(eq("transfernode.id", currTransferId))
     .forEach((Consumer<Document>) doc -> {
         doc.append("execnode", createMarkerNode("somedata")); // creates a sub-document
         docs.insertOne(doc); // one insert per document
     });
```
However, this is very inefficient: every document costs a separate network round trip to the server.
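One obvious improvement, assuming batching alone is acceptable, would be to buffer the documents and write each batch with the driver's `insertMany` instead of one `insertOne` per document. Below is a minimal, dependency-free sketch of just the batching logic; `toBatches` and `BatchCopy` are illustrative names, and plain `Map`s stand in for `org.bson.Document` (which itself implements `Map<String, Object>`):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class BatchCopy {
    // Appends the marker node to each document and groups the documents into
    // fixed-size batches. Each batch would then be written with a single
    // docs.insertMany(batch), turning N round trips into roughly N / batchSize.
    public static List<List<Map<String, Object>>> toBatches(
            List<Map<String, Object>> source,
            Map<String, Object> markerNode,
            int batchSize) {
        List<List<Map<String, Object>>> batches = new ArrayList<>();
        List<Map<String, Object>> current = new ArrayList<>();
        for (Map<String, Object> doc : source) {
            doc.put("execnode", markerNode); // same mutation as in the loop above
            current.add(doc);
            if (current.size() == batchSize) {
                batches.add(current);
                current = new ArrayList<>();
            }
        }
        if (!current.isEmpty()) {
            batches.add(current); // last, possibly short, batch
        }
        return batches;
    }

    public static void main(String[] args) {
        List<Map<String, Object>> spooled = new ArrayList<>();
        for (int i = 0; i < 5; i++) {
            Map<String, Object> doc = new LinkedHashMap<>();
            doc.put("_id", i);
            spooled.add(doc);
        }
        Map<String, Object> marker = new LinkedHashMap<>();
        marker.put("data", "somedata");
        // 5 documents in batches of 2 -> 3 batches
        System.out.println(toBatches(spooled, marker, 2).size());
    }
}
```

This still moves every document through the client, so it only reduces the round-trip overhead; it does not make the copy server-side.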
The perfect thing would be to use the aggregation pipeline and add the node in a $project stage, but unfortunately $project requires an explicit list of fields to include, and as already noted, we have no such list because the structure is unknown (i.e. up to the user).
(Sub-question: can the $project stage somehow be made to include everything, without knowing what this "everything" looks like?)
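The closest thing I am aware of is the `$$ROOT` system variable, which refers to the whole input document, but it only nests that document under a chosen field name rather than splicing its unknown fields to the top level, so it changes the shape. For illustration, this is the pipeline one would build (`buildPipeline` is an illustrative name; plain `Map`s stand in for `org.bson.Document`, and `markerNode` corresponds to the result of `createMarkerNode` above):

```java
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class RootPipeline {
    // Builds [ { $match: { "transfernode.id": ... } },
    //          { $project: { original: "$$ROOT", execnode: ... } } ].
    // Feeding this to spool.aggregate(...) would produce documents shaped as
    // { _id, original: { ...the whole source document... }, execnode: { ... } },
    // i.e. the unknown fields end up nested under "original", not at the top level.
    public static List<Map<String, Object>> buildPipeline(String transferId,
                                                          Map<String, Object> markerNode) {
        Map<String, Object> matchBody = new LinkedHashMap<>();
        matchBody.put("transfernode.id", transferId);
        Map<String, Object> match = new LinkedHashMap<>();
        match.put("$match", matchBody);

        Map<String, Object> projectBody = new LinkedHashMap<>();
        projectBody.put("original", "$$ROOT");
        projectBody.put("execnode", markerNode);
        Map<String, Object> project = new LinkedHashMap<>();
        project.put("$project", projectBody);

        return Arrays.asList(match, project);
    }

    public static void main(String[] args) {
        Map<String, Object> marker = new LinkedHashMap<>();
        marker.put("data", "somedata");
        System.out.println(buildPipeline("someTransferId", marker));
    }
}
```

So the shape problem here is essentially the same one the map-reduce attempt below runs into.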
The next experiment was MongoDB's mapReduce. Basically, this works like a breeze, but the result documents produced by map-reduce have the form
```
{
    _id: "anId",
    value: {
        ... (this is the document I want)
    }
}
```
(Note: I have seen the answer to the question [1]: mongoDB map/reduce minus the reduce, but calling db.insert in the finalize function is out of the question. The Mongo docs state that this is to be avoided absolutely, and I want to keep the solution "clean".)
Maybe there is a way to just "copy up" the value-part to new documents in a "bulk" way?
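Failing a server-side way, the unwrapping itself is at least trivial to do client-side: read each `{ _id, value }` result, promote `value` to the top level (re-attaching the original `_id` if a stable identifier is wanted), and write the results in bulk. A sketch of the unwrapping step, with `unwrap` as an illustrative name and plain `Map`s again standing in for `org.bson.Document`:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class Unwrap {
    // Promotes the map-reduce "value" sub-document to a top-level document,
    // carrying the _id over so the copy keeps a stable identifier.
    public static Map<String, Object> unwrap(Map<String, Object> mrDoc) {
        @SuppressWarnings("unchecked")
        Map<String, Object> value = (Map<String, Object>) mrDoc.get("value");
        Map<String, Object> out = new LinkedHashMap<>();
        out.put("_id", mrDoc.get("_id"));
        out.putAll(value); // splice the wanted fields to the top level
        return out;
    }

    public static void main(String[] args) {
        Map<String, Object> value = new LinkedHashMap<>();
        value.put("somefield", "somecontent");
        Map<String, Object> mrDoc = new LinkedHashMap<>();
        mrDoc.put("_id", "anId");
        mrDoc.put("value", value);
        System.out.println(unwrap(mrDoc));
    }
}
```

But this again pulls every document through the client, which is exactly what I was hoping to avoid.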
So, does anyone have an idea how to achieve this?
Edit: in the original question text above, I mentioned that the aggregation pipeline seemed a possible way. In the meantime I have discovered that the aggregation $out stage replaces the target collection instead of appending to it. Therefore, this cannot be used either. :-(