
So I have this schema, which contains arrays of subdocuments:

const mongoose = require("mongoose");
const Schema = mongoose.Schema;

let colorsSchema = new Schema({
    value: String
});

let sizesSchema = new Schema({
    value: String
});

let tshirtSchema = new Schema({
    id: { type: String, unique: true },
    name: String,
    url: { type: String, unique: true },
    colors: [colorsSchema],
    sizes: [sizesSchema],
    available: Boolean
}, { versionKey: false });


module.exports = mongoose.model('Tshirt', tshirtSchema);

When I insert a new document, it gives me an error saying:

 E11000 duplicate key error index: wwc.tshirts.$sizes.value_1 dup key: { : "L" }

I tried giving the fields unique indexes too, but the error still persists.

  • You'll need to manually drop that index. See https://stackoverflow.com/questions/23695676/mongoosejs-cant-disable-unique-to-field – JohnnyHK Sep 09 '17 at 14:48
  • @JohnnyHK But what if I want repeating data? Like many items can have "L" and "XL". – user3627194 Sep 09 '17 at 16:01
  • That's why you need to drop that unique index. Maybe I'm not understanding your question. – JohnnyHK Sep 09 '17 at 16:02
  • The unique index is the problem, which is why you are being told to remove it. If you want to enforce "unique in the array" then you use things like `$addToSet` and not indexes. Indexes apply to the "whole collection" and not just the document. So a "unique index" does not allow `"L"` to exist in any more than **one document**. – Neil Lunn Sep 10 '17 at 01:54
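As the comments explain, the error does not come from the schema shown above but from a unique index on sizes.value that already exists in the wwc.tshirts collection (the database and collection names come from the error message). Mongoose only creates indexes; it never drops an index that was removed from the schema, so the stale index has to be dropped by hand. A minimal sketch of doing that from Node, assuming the database runs on localhost and that "tshirts" is the collection Mongoose derived from the Tshirt model:

const mongoose = require("mongoose");

// Connection string is an assumption; "wwc" is the database named in the error.
mongoose.connect("mongodb://localhost/wwc")
    .then(() => mongoose.connection.db
        .collection("tshirts")
        .dropIndex("sizes.value_1"))        // drop the leftover unique index by name
    .then(() => console.log("Dropped index sizes.value_1"))
    .catch(err => console.error(err))
    .then(() => mongoose.connection.close());

The same thing can be done in one line from the mongo shell with db.tshirts.dropIndex("sizes.value_1").

If the goal is only to avoid duplicate sizes inside a single document (while still allowing "L" in many documents), an index cannot express that; as Neil Lunn notes, `$addToSet` is the tool for per-array uniqueness. A sketch under the assumption that sizes is changed to a plain string array (sizes: [String]), since $addToSet compares whole array elements and the auto-generated _id on subdocuments would make every element look distinct:

// Hypothetical path to the model module exported above.
const Tshirt = require("./models/tshirt");

// Adds "L" and "XL" to sizes only if they are not already present.
Tshirt.updateOne(
    { id: "tshirt-123" },                   // "tshirt-123" is a made-up id
    { $addToSet: { sizes: { $each: ["L", "XL"] } } }
).then(result => console.log(result));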
