I have a collection that requires frequent updating. The structure is as follows:
public class SearchEntry
{
    public ObjectId _Id { get; set; }
    public string EntryKey { get; set; }
    public string Description { get; set; }
    public string SearchTerm { get; set; }
}
There is a unique index on the EntryKey field.
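For reference, that index would have been created with something along these lines (the exact call is an assumption; only the unique index itself is stated here). Its default name, EntryKey_1, is what the hint in the code below refers to:

// collection is an IMongoCollection<SearchEntry>
await collection.Indexes.CreateOneAsync(
    new CreateIndexModel<SearchEntry>(
        Builders<SearchEntry>.IndexKeys.Ascending(e => e.EntryKey),
        new CreateIndexOptions { Unique = true })); // default name: EntryKey_1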
I am creating upsert operations like so:
var filter = filterBuilder.Eq(e => e.EntryKey, model.EntryKey);
var update = updateBuilder.Combine(
updateBuilder.SetOnInsert(e => e._Id, ObjectId.GenerateNewId()),
updateBuilder.SetOnInsert(e => e.EntryKey, model.EntryKey.ToLower()),
updateBuilder.Set(e => e.Description, model.Description),
updateBuilder.Set(e => e.SearchTerm, model.SearchTerm));
var op = new UpdateOneModel<SearchEntry>(filter, update){ IsUpsert = true, Hint = "EntryKey_1" };
bulkList.Add(op);
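For context, the builders and the final bulk call around that snippet look roughly like this (the variable setup shown here is illustrative, not the exact code):

var filterBuilder = Builders<SearchEntry>.Filter;
var updateBuilder = Builders<SearchEntry>.Update;
var bulkList = new List<WriteModel<SearchEntry>>();

// ... one UpdateOneModel per input model is added, as shown above ...

// collection is an IMongoCollection<SearchEntry>
var result = await collection.BulkWriteAsync(bulkList);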
Using the exact same input data for each test against a fresh Mongo instance, the first iteration succeeds and the second fails with E11000 duplicate key error collection on the EntryKey field.
When I remove the unique constraint from the index, duplicate documents are created in the collection (identical except for _Id).
When I run the equivalent command in the shell using the same key that failed, the command succeeds and the document is updated:
db.search.bulkWrite([
  {
    updateOne: {
      "filter": { "EntryKey": "ad544f72-496f-4eee-bf53-ffdda57de824" },
      "update": {
        $setOnInsert: {
          _id: ObjectId(),
          "EntryKey": "ad544f72-496f-4eee-bf53-ffdda57de824"
        },
        $set: {
          "Description": "This is a description",
          "SearchTerm": "search"
        }
      },
      "upsert": true
    }
  }
]);
I expect the documents that match the filter predicate to be updated, rather than either throwing a duplicate key error (when the unique index is enforced) or inserting essentially duplicate documents (when it is not).
I see that in this question the accepted answer is to separate the updates from the inserts. If that's the case, then what is the point of upsert if it cannot be used in the manner being attempted here?