Situation: I have several documents in the same collection (account), and each has an attribute named uniquestrings of type array of strings.
Problem: Each entry in uniquestrings must be unique across all documents in MongoDB. It seems that MongoDB/Mongoose does not offer such a validation (neither addToSet¹ nor index: {unique: true}² solves the problem). Is there a pattern for restructuring my document schema so that MongoDB itself can validate this? At the moment the software checks it before updating the document.
E.g.
account {
_id: 111,
uniquestrings: ["a", "b", "c"]
}
account {
_id: 222,
uniquestrings: ["d", "e", "f"]
}
E.g. prevent account(222).uniquestrings.push("a");
by having MongoDB throw a duplicate error.
¹ Uniqueness in an array is not enough
² Each item in array has to be unique across the collection
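One way to restructure, assuming the strings can live outside the account document: give each string its own document in a separate collection, keyed by the string itself, so MongoDB's unique `_id` constraint (or a unique index) enforces collection-wide uniqueness. The class and method names below are illustrative, not a Mongoose API; this is an in-memory sketch of the idea, where the thrown error plays the role of MongoDB's E11000 duplicate key error:

```javascript
// In-memory stand-in for a separate "uniquestrings" collection whose _id is
// the string itself. In MongoDB the equivalent would be roughly:
//   db.uniquestrings.insert({_id: "a", account: 111})
// and a duplicate key error (E11000) would replace the throw below.
class UniqueStringCollection {
  constructor() {
    this.docs = new Map(); // _id (the string) -> owning account id
  }
  insert(str, accountId) {
    if (this.docs.has(str)) {
      // Mirrors MongoDB rejecting a duplicate _id.
      throw new Error(`duplicate key: "${str}"`);
    }
    this.docs.set(str, accountId);
  }
}

const coll = new UniqueStringCollection();
coll.insert("a", 111);
coll.insert("b", 111);
coll.insert("d", 222);

let failed = false;
try {
  coll.insert("d", 333); // another account tries to claim "d"
} catch (e) {
  failed = true;
}
console.log(failed); // true: "d" is already taken
```

The trade-off is that the strings are no longer embedded in the account document, so reading an account's strings requires a second query (or the application keeps the embedded array as a denormalized copy and treats the separate collection as the source of truth for uniqueness).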
UPDATE1:
More examples. The affected schema entry looks like:
var Account = new Schema({
...
uniquestrings: [{type: String, index: {unique: true}}]
...
});
Now I create four account documents. I want only the first two to succeed; the rest should fail.
var accountModel1 = new Account.Model({uniquestrings: ["a", "b"]});
accountModel1.save(); // OK
var accountModel2 = new Account.Model({uniquestrings: ["c", "d"]});
accountModel2.save(); // OK
var accountModel3 = new Account.Model({uniquestrings: ["c", "d"]});
accountModel3.save(); // FAIL => Not unique, so far so good
var accountModel4 = new Account.Model({uniquestrings: ["X", "d"]});
accountModel4.save(); // OK => But I want this to fail, because "d" is already in use.
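The four saves above can be simulated against a separate unique-string registry. This is an in-memory sketch, not Mongoose behavior: `saveAccount`, the `registry` Set, and the rollback loop are all illustrative assumptions standing in for inserts into a second collection with a unique index (where a duplicate-key error would trigger the rollback):

```javascript
// Simulates saving accounts whose uniquestrings must be unique across the
// whole collection. `registry` stands in for a separate MongoDB collection
// with a unique index; a real implementation would insert there first and
// react to the E11000 duplicate key error.
function saveAccount(registry, id, strings) {
  const claimed = [];
  for (const s of strings) {
    if (registry.has(s)) {
      // Roll back the strings claimed so far, then report the duplicate.
      for (const c of claimed) registry.delete(c);
      return false; // FAIL, like a duplicate-key error
    }
    registry.add(s);
    claimed.push(s);
  }
  return true; // OK
}

const registry = new Set();
console.log(saveAccount(registry, 1, ["a", "b"])); // true  (OK)
console.log(saveAccount(registry, 2, ["c", "d"])); // true  (OK)
console.log(saveAccount(registry, 3, ["c", "d"])); // false (duplicate "c")
console.log(saveAccount(registry, 4, ["X", "d"])); // false ("d" already in use)
```

Note that the rollback makes the reserve step all-or-nothing, so the fourth save does not leave "X" claimed even though "X" itself was free.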