
This question is about a unique compound index, unlike other similar questions which involve only a single-field unique index. I also have sparse: true set on the indexes.

I have the following indexes in my collection:

[
  {
    "v": 2,
    "key": {
      "_id": 1
    },
    "name": "_id_",
    "ns": "somedb.votes"
  },
  {
    "v": 2,
    "key": {
      "answerId": 1
    },
    "name": "answerId_1",
    "ns": "somedb.votes",
    "sparse": true,
    "background": true
  },
  {
    "v": 2,
    "key": {
      "questionId": 1
    },
    "name": "questionId_1",
    "ns": "somedb.votes",
    "sparse": true,
    "background": true
  },
  {
    "v": 2,
    "unique": true,
    "key": {
      "answerId": 1,
      "votedBy": 1
    },
    "name": "answerId_1_votedBy_1",
    "ns": "somedb.votes",
    "sparse": true,
    "background": true
  },
  {
    "v": 2,
    "unique": true,
    "key": {
      "questionId": 1,
      "votedBy": 1
    },
    "name": "questionId_1_votedBy_1",
    "ns": "somedb.votes",
    "sparse": true,
    "background": true
  }
]

and I have the following document in the collection:

{
  "_id": ObjectId("59fdd3ce915511329553dfaa"),
  "updatedAt": ISODate("2017-11-04T14:54:22.110Z"),
  "votedAt": ISODate("2017-11-04T14:50:54.681Z"),
  "questionId": ObjectId("59fc77e45a857465a90339cc"),
  "value": -1,
  "votedBy": ObjectId("59fc4274aa686d39abe5d58a"),
  "type": "QuestionVote",
  "__v": 0
}

Now when I try to execute the following

db.votes.insert({ questionId: ObjectId("59fc798d5a857465a90339cf"), value: -1, votedBy: ObjectId("59fc4274aa686d39abe5d58a"), type: 'QuestionVote', _id: ObjectId("5a003240bfd8194a02d0add8") })

I get the following error

E11000 duplicate key error collection: somedb.votes index: answerId_1_votedBy_1 dup key: { : null, : ObjectId('59fc4274aa686d39abe5d58a') }
WriteResult({
  "nInserted": 0,
  "writeError": {
    "code": 11000,
    "errmsg": "E11000 duplicate key error collection: somedb.votes index: answerId_1_votedBy_1 dup key: { : null, : ObjectId('59fc4274aa686d39abe5d58a') }"
  }
})

I don't understand the reason. The indexes are sparse and compound, yet the error occurs merely because the same votedBy value is present.

That is, executing the following,

db.votes.insert({votedBy: ObjectId("59fc4274aa686d39abe5d58a")})

I get the following error even though there is no explicit index on the votedBy field by itself.

E11000 duplicate key error collection: somedb.votes index: answerId_1_votedBy_1 dup key: { : null, : ObjectId('59fc4274aa686d39abe5d58a') }
WriteResult({
  "nInserted": 0,
  "writeError": {
    "code": 11000,
    "errmsg": "E11000 duplicate key error collection: somedb.votes index: answerId_1_votedBy_1 dup key: { : null, : ObjectId('59fc4274aa686d39abe5d58a') }"
  }
})

Ref: Compound Index - https://docs.mongodb.com/manual/core/index-compound/#compound-indexes

abhisekp
  • Because, just like all the questions that have already been asked before, `answerId_1_votedBy_1 dup key: { : null,` indicates that your `answerId` was not supplied in every single document and is therefore `null`, which you asked to be "unique". Even in a **compound** index you can still only have **one**. So what you do, exactly the same as in all the other questions, is add "sparse" or a "partial index filter" in modern versions, to cater for the documents where you did not add this value which is part of the compound index (a sketch of this approach follows these comments). – Neil Lunn Nov 06 '17 at 10:58
  • @NeilLunn I've added `sparse`, but I'm still getting that error. You may see the indexes list; `sparse: true` is present. – abhisekp Nov 06 '17 at 11:06
  • So the whole reason it's blowing up on you is because you are "overlapping" on the `votedBy` field with two other keys that could make the compound. Essentially this means one user gets 1 vote on an answer, 1 vote on a question, and 1 vote that actually does not get assigned to either. That's what you are enforcing here and don't seem to understand that point. What you want instead is `"_id", "type", "voted_by"`, where there is **one** unique index and you discern on the value of "type" and not the name of the field `answerId/questionId`. That fixed naming and alternation is what causes the problem. – Neil Lunn Nov 06 '17 at 11:09
  • @NeilLunn But I cannot have `_id` as either `questionId` or `answerId`, as there can be the same `questionId` with different `votedBy`s. What should I do and how should I structure it? – abhisekp Nov 06 '17 at 11:20
  • I don't mean "literally" the primary key `_id`. Simply have **one field** for this value rather than **two**. – Neil Lunn Nov 06 '17 at 11:23
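
For reference, a minimal sketch of the "partial index filter" approach mentioned in the comments above, run from the mongo shell against the collection in the question (MongoDB 3.2+ assumed; the $exists filters are my own, not from the thread). The reason sparse does not help here is that a sparse compound index still indexes a document that has any of the keys, so a QuestionVote document enters answerId_1_votedBy_1 with answerId: null; a partial filter on the first key excludes those documents entirely.

// Sketch: replace the sparse unique compound indexes with partial indexes.
// A document missing answerId/questionId is then not indexed at all,
// so no { null, votedBy } entry can collide.
db.votes.dropIndex("answerId_1_votedBy_1")
db.votes.dropIndex("questionId_1_votedBy_1")

db.votes.createIndex(
  { answerId: 1, votedBy: 1 },
  { unique: true, partialFilterExpression: { answerId: { $exists: true } } }
)
db.votes.createIndex(
  { questionId: 1, votedBy: 1 },
  { unique: true, partialFilterExpression: { questionId: { $exists: true } } }
)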

1 Answer

This may be because there are old indexes on the same collection, which you can check from the console with this method:

db.votes.getIndexes()

Then drop them:

db.votes.dropIndexes()

In your application, where you define your schema, you should declare your compound index like this:

<your_schema_name>.index({
  field1: 1,
  field2: 1
  // ...any further fields in the compound key
}, {
  unique: true,
  sparse: true  // N.B. sparse: true is not a must for the compound unique index to work
});
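
As a concrete illustration for the votes collection in the question, that declaration might look roughly like this (the schema name VoteSchema and the exact field list are assumptions for the sketch, not taken from the answer):

const mongoose = require('mongoose');

// Hypothetical schema for the votes collection from the question
const VoteSchema = new mongoose.Schema({
  questionId: mongoose.Schema.Types.ObjectId,
  answerId: mongoose.Schema.Types.ObjectId,
  votedBy: mongoose.Schema.Types.ObjectId,
  value: Number,
  type: String
});

// Compound unique indexes declared on the schema, as described above
VoteSchema.index({ answerId: 1, votedBy: 1 }, { unique: true, sparse: true });
VoteSchema.index({ questionId: 1, votedBy: 1 }, { unique: true, sparse: true });

module.exports = mongoose.model('Vote', VoteSchema);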

Now restart your application and the final desired compound index should be created.

Extra Note

I found out that creating a unique index does not work when there are already records in your DB that violate it.

From Node.js:

From my Mongoose driver debug output, I can see that the driver tries to create the index:

Mongoose: <my_collection>.ensureIndex({ field1: 1, 'field2.xx.xxx': 1, field3: 1 }, { unique: true, background: true })

and I didn't receive any error from Node, but when I checked from the console with the getIndexes() method, I didn't find the new index.
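
One way to surface that silent failure from Node is to listen for Mongoose's 'index' event on the model, which fires once index building finishes and carries the error if a build failed (a sketch; the model name Vote is illustrative):

const Vote = mongoose.model('Vote', VoteSchema);

// Fires after Mongoose finishes building the schema's indexes;
// err is set if any index (e.g. a violated unique index) could not be built.
Vote.on('index', function (err) {
  if (err) {
    console.error('Index build failed:', err);
  }
});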

From the console:

I tried to run ensureIndex:

db.<my_collection>.ensureIndex({ field1: 1, 'field2.xx.xxx': 1, field3: 1 }, { unique: true, background: true })

I got an error referencing the record that violates this index:

{
    "ok" : 0,
    "errmsg" : "E11000 duplicate key error collection: <db_name>.<collection_name> index: field1_xxx.xxxx.xxxx_1_filed2_1 dup key: { : \"xxx\", : \"xxxx\", : ObjectId('xxx') }",
    "code" : 11000
}
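
If you want to locate the offending documents before retrying, a small aggregation can group on the would-be index key; a sketch against the question's votes collection, run in the mongo shell (field names taken from the question):

// Find answerId/votedBy combinations that occur more than once,
// i.e. the documents that would violate the unique compound index.
db.votes.aggregate([
  { $group: {
      _id: { answerId: "$answerId", votedBy: "$votedBy" },
      count: { $sum: 1 },
      ids: { $push: "$_id" }
  } },
  { $match: { count: { $gt: 1 } } }
])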

Conclusion

Unique indexing (compound or not) will not work if there is any record that violates the new desired index, even if you drop all your indexes and retry to re-index as I mentioned before.

Farouk Elkholy
  • I had the collection dropped and recreated, but the issue is as pointed out by @NeilLunn in the comments on the question. The solution is to use the same field in both schemas and index that one field, rather than having two different fields (roughly as sketched below). – abhisekp Nov 10 '17 at 04:26
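
For reference, a rough sketch of that single-field restructure from the mongo shell (the field name votedId and the example documents are illustrative, not from the thread):

// One field holds the id of whatever was voted on; "type" distinguishes
// question votes from answer votes, so a single unique index covers both.
db.votes.createIndex({ type: 1, votedId: 1, votedBy: 1 }, { unique: true })

// Example documents under this layout:
// { type: "QuestionVote", votedId: ObjectId("..."), votedBy: ObjectId("..."), value: -1 }
// { type: "AnswerVote",   votedId: ObjectId("..."), votedBy: ObjectId("..."), value: 1 }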