309

How can I (in MongoDB) combine data from multiple collections into one collection?

Can I use map-reduce and, if so, how?

I would greatly appreciate an example as I am a novice.

KARTHIKEYAN.A
  • 18,210
  • 6
  • 124
  • 133
user697697
  • 3,327
  • 3
  • 18
  • 12
  • 20
    Do you just want to copy docs from different collections into one single collection or what's your plan? Can you specify "combine"? If you just want to copy via mongo shell a `db.collection1.find().forEach(function(doc){db.collection2.save(doc)});` is enough. Please specify your used driver (java, php, ...) if you don't use mongo shell. – proximus Apr 15 '11 at 22:11
  • 1
    So I have a collection (say users) that has other collections, say an address book collection, a list of books collection, etc. How can I, based on say the user_id key, combine these collections into just one single collection? – user697697 Apr 18 '11 at 13:53
  • 3
    Related: http://stackoverflow.com/q/2350495/435605 – AlikElzin-kilaka Dec 12 '16 at 14:38

11 Answers

199

MongoDB 3.2 now allows one to combine data from multiple collections into one through the $lookup aggregation stage. As a practical example, let's say that you have data about books split into two different collections.

The first collection, called books, has the following data:

{
    "isbn": "978-3-16-148410-0",
    "title": "Some cool book",
    "author": "John Doe"
}
{
    "isbn": "978-3-16-148999-9",
    "title": "Another awesome book",
    "author": "Jane Roe"
}

And the second collection, called books_selling_data, has the following data:

{
    "_id": ObjectId("56e31bcf76cdf52e541d9d26"),
    "isbn": "978-3-16-148410-0",
    "copies_sold": 12500
}
{
    "_id": ObjectId("56e31ce076cdf52e541d9d28"),
    "isbn": "978-3-16-148999-9",
    "copies_sold": 720050
}
{
    "_id": ObjectId("56e31ce076cdf52e541d9d29"),
    "isbn": "978-3-16-148999-9",
    "copies_sold": 1000
}

Merging both collections is just a matter of using $lookup in the following way:

db.books.aggregate([{
    $lookup: {
            from: "books_selling_data",
            localField: "isbn",
            foreignField: "isbn",
            as: "copies_sold"
        }
}])

The result of this aggregation will look like the following:

{
    "isbn": "978-3-16-148410-0",
    "title": "Some cool book",
    "author": "John Doe",
    "copies_sold": [
        {
            "_id": ObjectId("56e31bcf76cdf52e541d9d26"),
            "isbn": "978-3-16-148410-0",
            "copies_sold": 12500
        }
    ]
}
{
    "isbn": "978-3-16-148999-9",
    "title": "Another awesome book",
    "author": "Jane Roe",
    "copies_sold": [
        {
            "_id": ObjectId("56e31ce076cdf52e541d9d28"),
            "isbn": "978-3-16-148999-9",
            "copies_sold": 720050
        },
        {
            "_id": ObjectId("56e31ce076cdf52e541d9d28"),
            "isbn": "978-3-16-148999-9",
            "copies_sold": 1000
        }
    ]
}

It is important to note a few things:

  1. The "from" collection, in this case books_selling_data, cannot be sharded.
  2. The "as" field will be an array, as the example above.
  3. Both "localField" and "foreignField" options on the $lookup stage will be treated as null for matching purposes if they don't exist in their respective collections (the $lookup docs has a perfect example about that).

So, in conclusion, if you want to consolidate both collections into one with, in this case, a flat copies_sold field containing the total copies sold, you will have to work a little bit more, probably using an intermediate collection that is then sent with $out to the final collection.
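
A rough sketch of that extra work (the $group stage and the books_merged output collection name are illustrative assumptions, not part of the original example) could $unwind the joined array, $group by isbn to sum the copies sold, and $out the result:

db.books.aggregate([
    {
        $lookup: {
            from: "books_selling_data",
            localField: "isbn",
            foreignField: "isbn",
            as: "copies_sold"
        }
    },
    // flatten the joined array so each sales record becomes its own document
    { $unwind: { path: "$copies_sold", preserveNullAndEmptyArrays: true } },
    // group back by isbn, summing copies sold across the sales records
    {
        $group: {
            _id: "$isbn",
            title: { $first: "$title" },
            author: { $first: "$author" },
            copies_sold: { $sum: "$copies_sold.copies_sold" }
        }
    },
    // write the consolidated documents to a new collection
    { $out: "books_merged" }
])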

Dev01
  • 13,292
  • 19
  • 70
  • 124
Bruno Krebs
  • 3,059
  • 1
  • 24
  • 36
  • hi there, kindly can you tell what will be the optimized way to manage data like this : User, file.files and file.chunks are three collections , i want specific user with all its related file in a response is it possible.? { "name" : "batMan", "email' : "bt@gmail.com", "files" : [ {file1},{file2},{file3},.... so on ] } – mfaisalhyder May 19 '16 at 09:53
  • Official documentation examples for the above solution can be found here: https://docs.mongodb.com/manual/reference/operator/aggregation/lookup/ – Jakub Czaplicki Jun 17 '16 at 11:38
  • 4
    Well, actually my answer already had three links to the official documentation. But thanks for your contribution anyway. @JakubCzaplicki – Bruno Krebs Jun 19 '16 at 14:00
  • 2
    I might be having a total brain malfunction (most likely) but in ```$lookup``` shouldn't all both "localField" and "foreignField" equal "isbn"? not "_id" and "isbn"? – Dev01 Aug 31 '16 at 14:44
154

Although you can't do this in real time, you can run map-reduce multiple times to merge data together by using the "reduce" out option in MongoDB 1.8+ map/reduce (see http://www.mongodb.org/display/DOCS/MapReduce#MapReduce-Outputoptions). You need to have some key in both collections that you can use as an _id.

For example, let's say you have a users collection and a comments collection and you want to have a new collection that has some user demographic info for each comment.

Let's say the users collection has the following fields:

  • _id
  • firstName
  • lastName
  • country
  • gender
  • age

And then the comments collection has the following fields:

  • _id
  • userId
  • comment
  • created

You would do this map/reduce:

var mapUsers, mapComments, reduce;
db.users_comments.remove();

// setup sample data - wouldn't actually use this in production
db.users.remove();
db.comments.remove();
db.users.save({firstName:"Rich",lastName:"S",gender:"M",country:"CA",age:"18"});
db.users.save({firstName:"Rob",lastName:"M",gender:"M",country:"US",age:"25"});
db.users.save({firstName:"Sarah",lastName:"T",gender:"F",country:"US",age:"13"});
var users = db.users.find();
db.comments.save({userId: users[0]._id, "comment": "Hey, what's up?", created: new ISODate()});
db.comments.save({userId: users[1]._id, "comment": "Not much", created: new ISODate()});
db.comments.save({userId: users[0]._id, "comment": "Cool", created: new ISODate()});
// end sample data setup

mapUsers = function() {
    var values = {
        country: this.country,
        gender: this.gender,
        age: this.age
    };
    emit(this._id, values);
};
mapComments = function() {
    var values = {
        commentId: this._id,
        comment: this.comment,
        created: this.created
    };
    emit(this.userId, values);
};
reduce = function(k, values) {
    var result = {}, commentFields = {
        "commentId": '', 
        "comment": '',
        "created": ''
    };
    values.forEach(function(value) {
        var field;
        if ("comment" in value) {
            if (!("comments" in result)) {
                result.comments = [];
            }
            result.comments.push(value);
        } else if ("comments" in value) {
            if (!("comments" in result)) {
                result.comments = [];
            }
            result.comments.push.apply(result.comments, value.comments);
        }
        for (field in value) {
            if (value.hasOwnProperty(field) && !(field in commentFields)) {
                result[field] = value[field];
            }
        }
    });
    return result;
};
db.users.mapReduce(mapUsers, reduce, {"out": {"reduce": "users_comments"}});
db.comments.mapReduce(mapComments, reduce, {"out": {"reduce": "users_comments"}});
db.users_comments.find().pretty(); // see the resulting collection

At this point, you will have a new collection called users_comments that contains the merged data, and you can now use it. These reduced collections all have an _id, which is the key you were emitting in your map functions, and all of the values are a sub-object inside the value key; the values aren't at the top level of these reduced documents.
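
Roughly, a document in users_comments will have this shape (the ids and dates are abridged here for illustration):

{
    "_id" : ObjectId("..."),
    "value" : {
        "country" : "CA",
        "gender" : "M",
        "age" : "18",
        "comments" : [
            {
                "commentId" : ObjectId("..."),
                "comment" : "Hey, what's up?",
                "created" : ISODate("...")
            },
            {
                "commentId" : ObjectId("..."),
                "comment" : "Cool",
                "created" : ISODate("...")
            }
        ]
    }
}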

This is a somewhat simple example. You can repeat this with more collections as much as you want to keep building up the reduced collection. You could also do summaries and aggregations of data in the process. Likely you would define more than one reduce function as the logic for aggregating and preserving existing fields gets more complex.

You'll also note that there is now one document for each user with all of that user's comments in an array. If we were merging data that has a one-to-one relationship rather than one-to-many, it would be flat and you could simply use a reduce function like this:

reduce = function(k, values) {
    var result = {};
    values.forEach(function(value) {
        var field;
        for (field in value) {
            if (value.hasOwnProperty(field)) {
                result[field] = value[field];
            }
        }
    });
    return result;
};

If you want to flatten the users_comments collection so it's one document per comment, additionally run this:

var map, reduce;
map = function() {
    var debug = function(value) {
        var field;
        for (field in value) {
            print(field + ": " + value[field]);
        }
    };
    debug(this);
    var that = this;
    if ("comments" in this.value) {
        this.value.comments.forEach(function(value) {
            emit(value.commentId, {
                userId: that._id,
                country: that.value.country,
                age: that.value.age,
                comment: value.comment,
                created: value.created,
            });
        });
    }
};
reduce = function(k, values) {
    var result = {};
    values.forEach(function(value) {
        var field;
        for (field in value) {
            if (value.hasOwnProperty(field)) {
                result[field] = value[field];
            }
        }
    });
    return result;
};
db.users_comments.mapReduce(map, reduce, {"out": "comments_with_demographics"});

This technique should definitely not be performed on the fly. It's suited for a cron job or something similar that updates the merged data periodically. You'll probably want to run ensureIndex on the new collection to make sure queries you perform against it run quickly. Keep in mind that your data is still inside a value key, so if you were to index comments_with_demographics on the comment created time, it would be db.comments_with_demographics.ensureIndex({"value.created": 1});

rmarscher
  • 5,596
  • 2
  • 28
  • 30
  • 1
    I would probably never do that in production software, but it's still a wicked cool technique. – Dave Griffith Jan 05 '12 at 18:02
  • 3
    Thanks, Dave. I used this technique for generating export and reporting tables for a high traffic site in production for the last 3 months without issue. Here's another article that describes a similar use of the technique: http://tebros.com/2011/07/using-mongodb-mapreduce-to-join-2-collections/ – rmarscher Jan 06 '12 at 15:53
  • 1
    Thanks @rmarscher your extra details really helped me to better understand everything. – benstr Nov 10 '14 at 21:29
  • 5
    I should update this answer with an example using the aggregation pipeline and the new $lookup operation. Mentioning it here until I can put together a proper write-up. https://docs.mongodb.org/manual/reference/operator/aggregation/lookup/ – rmarscher Jan 09 '16 at 17:46
  • 1
    FYI for those wanting to quickly grok what this does, here's what's in the `users_comments` collection after the first block of code https://gist.github.com/nolanamy/83d7fb6a9bf92482a1c4311ad9c78835 – Nolan Amy Sep 27 '16 at 22:51
45

Doing unions in MongoDB in a 'SQL UNION' fashion is possible using aggregations along with lookups, in a single query. Here is an example I have tested that works with MongoDB 4.0:

// Create employees data for testing the union.
db.getCollection('employees').insert({ name: "John", type: "employee", department: "sales" });
db.getCollection('employees').insert({ name: "Martha", type: "employee", department: "accounting" });
db.getCollection('employees').insert({ name: "Amy", type: "employee", department: "warehouse" });
db.getCollection('employees').insert({ name: "Mike", type: "employee", department: "warehouse"  });

// Create freelancers data for testing the union.
db.getCollection('freelancers').insert({ name: "Stephany", type: "freelancer", department: "accounting" });
db.getCollection('freelancers').insert({ name: "Martin", type: "freelancer", department: "sales" });
db.getCollection('freelancers').insert({ name: "Doug", type: "freelancer", department: "warehouse"  });
db.getCollection('freelancers').insert({ name: "Brenda", type: "freelancer", department: "sales"  });

// Here we do a union of the employees and freelancers using a single aggregation query.
db.getCollection('freelancers').aggregate( // 1. Use any collection containing at least one document.
  [
    { $limit: 1 }, // 2. Keep only one document of the collection.
    { $project: { _id: '$$REMOVE' } }, // 3. Remove everything from the document.

    // 4. Lookup collections to union together.
    { $lookup: { from: 'employees', pipeline: [{ $match: { department: 'sales' } }], as: 'employees' } },
    { $lookup: { from: 'freelancers', pipeline: [{ $match: { department: 'sales' } }], as: 'freelancers' } },

    // 5. Union the collections together with a projection.
    { $project: { union: { $concatArrays: ["$employees", "$freelancers"] } } },

    // 6. Unwind and replace root so you end up with a result set.
    { $unwind: '$union' },
    { $replaceRoot: { newRoot: '$union' } }
  ]);

Here is the explanation of how it works:

  1. Start the aggregation from any collection in your database that has at least one document in it. If you can't guarantee that such a collection will exist, you can work around this issue by creating some sort of 'dummy' collection containing a single empty document that exists specifically for running union queries.

  2. Make the first stage of your pipeline { $limit: 1 }. This will strip out all documents of the collection except the first one.

  3. Strip all the fields of the remaining document by using a $project stage:

    { $project: { _id: '$$REMOVE' } }
    
  4. Your aggregate now contains a single, empty document. It's time to add lookups for each collection you want to union together. You may use the pipeline field to do some specific filtering, or leave localField and foreignField as null to match the whole collection.

    { $lookup: { from: 'collectionToUnion1', pipeline: [...], as: 'Collection1' } },
    { $lookup: { from: 'collectionToUnion2', pipeline: [...], as: 'Collection2' } },
    { $lookup: { from: 'collectionToUnion3', pipeline: [...], as: 'Collection3' } }
    
  5. You now have an aggregate containing a single document that contains 3 arrays like this:

    {
        Collection1: [...],
        Collection2: [...],
        Collection3: [...]
    }
    

    You can then merge them together into a single array using a $project stage along with the $concatArrays aggregation operator:

    {
      "$project" :
      {
        "Union" : { $concatArrays: ["$Collection1", "$Collection2", "$Collection3"] }
      }
    }
    
  6. You now have an aggregate containing a single document that holds an array with your union of collections. What remains is to add $unwind and $replaceRoot stages to split the array into separate documents:

    { $unwind: "$Union" },
    { $replaceRoot: { newRoot: "$Union" } }
    
  7. Voilà. You now have a result set containing the collections you wanted to union together. You can then add more stages to filter it further, sort it, apply skip() and limit(). Pretty much anything you want.
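
For instance, extra stages appended after the $replaceRoot stage could look like this (the sort field and the skip/limit values are purely illustrative):

{ $sort: { name: 1 } },
{ $skip: 20 },
{ $limit: 10 }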

sboisse
  • 4,860
  • 3
  • 37
  • 48
  • The query is failing with the message "$projection requires at least one output field". – supriyo_basak May 08 '19 at 09:53
  • @abhishek If you get that it's because you tried to strip all fields out of the single document in a single projection stage. MongoDB won't let you do this. To workaround this you need to do 2 successive projections where the first one strips everything but the _id, and the second one strips the remaining _id. – sboisse May 09 '19 at 10:21
  • @abhishek I have simplified even more the query by replacing the $project stages in a single one that uses the '$$REMOVE' variable. I also added a concrete example that you can just copy and paste directly in your query tester to see that it works. – sboisse May 09 '19 at 13:39
  • @sboisse, this solution works for smaller collections, however, if I want to do this on big collections, (100,000+ docs), I run into a "Total size of documents in collectionToUnion1 exceeds maximum document size". In the docs, it suggests putting an $unwind directly after the $lookup to avoid creating large intermediate documents. I have not been successful at modifying this solution using that method. Have you run into this issue and had to use that method? Link to the docs I'm ref to: [link] (https://docs.mongodb.com/manual/core/aggregation-pipeline-optimization/#lookup-unwind-coalescence) – lucky7samson Jul 15 '19 at 19:33
  • @lucky7samson unfortunately the amount of data I had to deal with was not that large. So I did not have to confront the problem you are referring to. In my case I could apply filtering on the collection to lookup before merging the records with the rest, so the amount of data to union was quite small. – sboisse Aug 16 '19 at 18:03
  • 1
    kudos for such a detailed explanation of each step – Kevin Südmersen Jul 07 '20 at 08:39
  • 1
    @sboisse how will this query perform on large collections? – Ankita Dec 26 '20 at 07:09
  • 3
    @ankita my personal experience with this approach has been very satisfying so far for performance. But if you need to do aggregation in a SQL UNION fashion, I don't see an alternative. If you have performance issues with this approach, I would look into optimizing my queries in the pipelines of the lookups, and add proper indexing of looked up collections. The more you filter out in the initial steps of the pipeline, the better. At step 1, I would also try to choose a small collection. Perhaps a collection that contains exactly one document so that this steps is as fast as possible. – sboisse Dec 27 '20 at 08:15
  • @sboisse have you tried converting the query to java using spring mongo? I am having a hard time doing that. Can you provide some insights please? – Ankita Jan 07 '21 at 15:41
  • @ankita - never worked in Java sorry. You should make it a separate Stack Overflow question, this is off topic. – sboisse Jan 08 '21 at 18:31
  • your method is one of a kind, had been looking for this for many hours, havent seen this anywhere, how did you come up with this? – Ali80 Mar 18 '21 at 12:42
20

Starting in Mongo 4.4, we can achieve this join within an aggregation pipeline by coupling the new $unionWith aggregation stage with $group's new $accumulator operator:

// > db.users.find()
//   [{ user: 1, name: "x" }, { user: 2, name: "y" }]
// > db.books.find()
//   [{ user: 1, book: "a" }, { user: 1, book: "b" }, { user: 2, book: "c" }]
// > db.movies.find()
//   [{ user: 1, movie: "g" }, { user: 2, movie: "h" }, { user: 2, movie: "i" }]
db.users.aggregate([
  { $unionWith: "books"  },
  { $unionWith: "movies" },
  { $group: {
    _id: "$user",
    user: {
      $accumulator: {
        accumulateArgs: ["$name", "$book", "$movie"],
        init: function() { return { books: [], movies: [] } },
        accumulate: function(user, name, book, movie) {
          if (name) user.name = name;
          if (book) user.books.push(book);
          if (movie) user.movies.push(movie);
          return user;
        },
        merge: function(userV1, userV2) {
          if (userV2.name) userV1.name = userV2.name;
          userV1.books = userV1.books.concat(userV2.books);
          userV1.movies = userV1.movies.concat(userV2.movies);
          return userV1;
        },
        lang: "js"
      }
    }
  }}
])
// { _id: 1, user: { books: ["a", "b"], movies: ["g"], name: "x" } }
// { _id: 2, user: { books: ["c"], movies: ["h", "i"], name: "y" } }
  • $unionWith combines records from the given collection with documents already in the aggregation pipeline. After the 2 union stages, we thus have all users, books and movies records within the pipeline.

  • We then $group records by $user and accumulate items using the $accumulator operator allowing custom accumulations of documents as they get grouped:

    • the fields we're interested in accumulating are defined with accumulateArgs.
    • init defines the state that will be accumulated as we group elements.
    • the accumulate function allows performing a custom action with a record being grouped in order to build the accumulated state. For instance, if the item being grouped has the book field defined, then we update the books part of the state.
    • merge is used to merge two internal states. It's only used for aggregations running on sharded clusters or when the operation exceeds memory limits.
Xavier Guihot
  • 54,987
  • 21
  • 291
  • 190
19

Very basic example with $lookup.

db.getCollection('users').aggregate([
    {
        $lookup: {
            from: "userinfo",
            localField: "userId",
            foreignField: "userId",
            as: "userInfoData"
        }
    },
    {
        $lookup: {
            from: "userrole",
            localField: "userId",
            foreignField: "userId",
            as: "userRoleData"
        }
    },
    { $unwind: { path: "$userInfoData", preserveNullAndEmptyArrays: true }},
    { $unwind: { path: "$userRoleData", preserveNullAndEmptyArrays: true }}
])

Here we use

 { $unwind: { path: "$userInfoData", preserveNullAndEmptyArrays: true }},
 { $unwind: { path: "$userRoleData", preserveNullAndEmptyArrays: true }}

instead of

{ $unwind: "$userInfoData" }
{ $unwind: "$userRoleData" }

because a plain { $unwind: "$userRoleData" } would drop documents entirely when $lookup finds no matching record.

Anish Agarwal
  • 1,793
  • 19
  • 19
14

If bulk insert is not available in your MongoDB version, you can loop over all documents in small_collection and insert them one by one into big_collection:

db.small_collection.find().forEach(function(obj){ 
   db.big_collection.insert(obj)
});
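
If your MongoDB version does support bulk writes (the shell helper insertMany exists in 3.2+), a shorter sketch would be:

// assumes small_collection is non-empty and fits comfortably in memory
db.big_collection.insertMany(db.small_collection.find().toArray());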
Hieu Le
  • 2,106
  • 1
  • 23
  • 23
14

Use multiple $lookup stages for multiple collections in an aggregation.

Query:

db.getCollection('servicelocations').aggregate([
  {
    $match: {
      serviceLocationId: {
        $in: ["36728"]
      }
    }
  },
  {
    $lookup: {
      from: "orders",
      localField: "serviceLocationId",
      foreignField: "serviceLocationId",
      as: "orders"
    }
  },
  {
    $lookup: {
      from: "timewindowtypes",
      localField: "timeWindow.timeWindowTypeId",
      foreignField: "timeWindowTypeId",
      as: "timeWindow"
    }
  },
  {
    $lookup: {
      from: "servicetimetypes",
      localField: "serviceTimeTypeId",
      foreignField: "serviceTimeTypeId",
      as: "serviceTime"
    }
  },
  {
    $unwind: "$orders"
  },
  {
    $unwind: "$serviceTime"
  },
  {
    $limit: 14
  }
])

Result:

{
    "_id" : ObjectId("59c3ac4bb7799c90ebb3279b"),
    "serviceLocationId" : "36728",
    "regionId" : 1.0,
    "zoneId" : "DXBZONE1",
    "description" : "AL HALLAB REST EMIRATES MALL",
    "locationPriority" : 1.0,
    "accountTypeId" : 1.0,
    "locationType" : "SERVICELOCATION",
    "location" : {
        "makani" : "",
        "lat" : 25.119035,
        "lng" : 55.198694
    },
    "deliveryDays" : "MTWRFSU",
    "timeWindow" : [ 
        {
            "_id" : ObjectId("59c3b0a3b7799c90ebb32cde"),
            "timeWindowTypeId" : "1",
            "Description" : "MORNING",
            "timeWindow" : {
                "openTime" : "06:00",
                "closeTime" : "08:00"
            },
            "accountId" : 1.0
        }, 
        {
            "_id" : ObjectId("59c3b0a3b7799c90ebb32cdf"),
            "timeWindowTypeId" : "1",
            "Description" : "MORNING",
            "timeWindow" : {
                "openTime" : "09:00",
                "closeTime" : "10:00"
            },
            "accountId" : 1.0
        }, 
        {
            "_id" : ObjectId("59c3b0a3b7799c90ebb32ce0"),
            "timeWindowTypeId" : "1",
            "Description" : "MORNING",
            "timeWindow" : {
                "openTime" : "10:30",
                "closeTime" : "11:30"
            },
            "accountId" : 1.0
        }
    ],
    "address1" : "",
    "address2" : "",
    "phone" : "",
    "city" : "",
    "county" : "",
    "state" : "",
    "country" : "",
    "zipcode" : "",
    "imageUrl" : "",
    "contact" : {
        "name" : "",
        "email" : ""
    },
    "status" : "ACTIVE",
    "createdBy" : "",
    "updatedBy" : "",
    "updateDate" : "",
    "accountId" : 1.0,
    "serviceTimeTypeId" : "1",
    "orders" : [ 
        {
            "_id" : ObjectId("59c3b291f251c77f15790f92"),
            "orderId" : "AQ18O1704264",
            "serviceLocationId" : "36728",
            "orderNo" : "AQ18O1704264",
            "orderDate" : "18-Sep-17",
            "description" : "AQ18O1704264",
            "serviceType" : "Delivery",
            "orderSource" : "Import",
            "takenBy" : "KARIM",
            "plannedDeliveryDate" : ISODate("2017-08-26T00:00:00.000Z"),
            "plannedDeliveryTime" : "",
            "actualDeliveryDate" : "",
            "actualDeliveryTime" : "",
            "deliveredBy" : "",
            "size1" : 296.0,
            "size2" : 3573.355,
            "size3" : 240.811,
            "jobPriority" : 1.0,
            "cancelReason" : "",
            "cancelDate" : "",
            "cancelBy" : "",
            "reasonCode" : "",
            "reasonText" : "",
            "status" : "",
            "lineItems" : [ 
                {
                    "ItemId" : "BNWB020",
                    "size1" : 15.0,
                    "size2" : 78.6,
                    "size3" : 6.0
                }, 
                {
                    "ItemId" : "BNWB021",
                    "size1" : 20.0,
                    "size2" : 252.0,
                    "size3" : 11.538
                }, 
                {
                    "ItemId" : "BNWB023",
                    "size1" : 15.0,
                    "size2" : 285.0,
                    "size3" : 16.071
                }, 
                {
                    "ItemId" : "CPMW112",
                    "size1" : 3.0,
                    "size2" : 25.38,
                    "size3" : 1.731
                }, 
                {
                    "ItemId" : "MMGW001",
                    "size1" : 25.0,
                    "size2" : 464.375,
                    "size3" : 46.875
                }, 
                {
                    "ItemId" : "MMNB218",
                    "size1" : 50.0,
                    "size2" : 920.0,
                    "size3" : 60.0
                }, 
                {
                    "ItemId" : "MMNB219",
                    "size1" : 50.0,
                    "size2" : 630.0,
                    "size3" : 40.0
                }, 
                {
                    "ItemId" : "MMNB220",
                    "size1" : 50.0,
                    "size2" : 416.0,
                    "size3" : 28.846
                }, 
                {
                    "ItemId" : "MMNB270",
                    "size1" : 50.0,
                    "size2" : 262.0,
                    "size3" : 20.0
                }, 
                {
                    "ItemId" : "MMNB302",
                    "size1" : 15.0,
                    "size2" : 195.0,
                    "size3" : 6.0
                }, 
                {
                    "ItemId" : "MMNB373",
                    "size1" : 3.0,
                    "size2" : 45.0,
                    "size3" : 3.75
                }
            ],
            "accountId" : 1.0
        }, 
        {
            "_id" : ObjectId("59c3b291f251c77f15790f9d"),
            "orderId" : "AQ137O1701240",
            "serviceLocationId" : "36728",
            "orderNo" : "AQ137O1701240",
            "orderDate" : "18-Sep-17",
            "description" : "AQ137O1701240",
            "serviceType" : "Delivery",
            "orderSource" : "Import",
            "takenBy" : "KARIM",
            "plannedDeliveryDate" : ISODate("2017-08-26T00:00:00.000Z"),
            "plannedDeliveryTime" : "",
            "actualDeliveryDate" : "",
            "actualDeliveryTime" : "",
            "deliveredBy" : "",
            "size1" : 28.0,
            "size2" : 520.11,
            "size3" : 52.5,
            "jobPriority" : 1.0,
            "cancelReason" : "",
            "cancelDate" : "",
            "cancelBy" : "",
            "reasonCode" : "",
            "reasonText" : "",
            "status" : "",
            "lineItems" : [ 
                {
                    "ItemId" : "MMGW001",
                    "size1" : 25.0,
                    "size2" : 464.38,
                    "size3" : 46.875
                }, 
                {
                    "ItemId" : "MMGW001-F1",
                    "size1" : 3.0,
                    "size2" : 55.73,
                    "size3" : 5.625
                }
            ],
            "accountId" : 1.0
        }, 
        {
            "_id" : ObjectId("59c3b291f251c77f15790fd8"),
            "orderId" : "AQ110O1705036",
            "serviceLocationId" : "36728",
            "orderNo" : "AQ110O1705036",
            "orderDate" : "18-Sep-17",
            "description" : "AQ110O1705036",
            "serviceType" : "Delivery",
            "orderSource" : "Import",
            "takenBy" : "KARIM",
            "plannedDeliveryDate" : ISODate("2017-08-26T00:00:00.000Z"),
            "plannedDeliveryTime" : "",
            "actualDeliveryDate" : "",
            "actualDeliveryTime" : "",
            "deliveredBy" : "",
            "size1" : 60.0,
            "size2" : 1046.0,
            "size3" : 68.0,
            "jobPriority" : 1.0,
            "cancelReason" : "",
            "cancelDate" : "",
            "cancelBy" : "",
            "reasonCode" : "",
            "reasonText" : "",
            "status" : "",
            "lineItems" : [ 
                {
                    "ItemId" : "MMNB218",
                    "size1" : 50.0,
                    "size2" : 920.0,
                    "size3" : 60.0
                }, 
                {
                    "ItemId" : "MMNB219",
                    "size1" : 10.0,
                    "size2" : 126.0,
                    "size3" : 8.0
                }
            ],
            "accountId" : 1.0
        }
    ],
    "serviceTime" : {
        "_id" : ObjectId("59c3b07cb7799c90ebb32cdc"),
        "serviceTimeTypeId" : "1",
        "serviceTimeType" : "nohelper",
        "description" : "",
        "fixedTime" : 30.0,
        "variableTime" : 0.0,
        "accountId" : 1.0
    }
}
KARTHIKEYAN.A
  • 18,210
  • 6
  • 124
  • 133
1

mongorestore appends on top of whatever is already in the database, so this behavior can be used to combine two collections:

  1. mongodump collection1
  2. db.collection2.renameCollection("collection1")
  3. mongorestore

Didn't try it yet, but it might perform faster than the map/reduce approach.
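
A rough sketch of those steps, where the database name mydb and the default dump/ output path are assumptions:

# 1. dump collection1 to disk
mongodump --db mydb --collection collection1 --out dump/

# 2. in the mongo shell, rename collection2 so it takes collection1's place
db.collection2.renameCollection("collection1")

# 3. restore the dump; mongorestore inserts the dumped documents on top of the existing ones
mongorestore --db mydb --collection collection1 dump/mydb/collection1.bson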

Unihedron
  • 10,902
  • 13
  • 62
  • 72
shauli
  • 402
  • 2
  • 4
0

Yes, you can. Take this utility function that I wrote today:

function shangMergeCol() {
  // first argument is the target collection; the rest are source collections
  var tcol = db.getCollection(arguments[0]);
  for (var i = 1; i < arguments.length; i++) {
    var scol = db.getCollection(arguments[i]);
    scol.find().forEach(
        function (d) {
            tcol.insert(d);
        }
    );
  }
}

You can pass any number of collections to this function; the first one is the target. All the remaining collections are sources whose documents are transferred to the target.
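
For example, with purely illustrative collection names:

shangMergeCol("target_collection", "source_collection_1", "source_collection_2");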

Shangab
  • 39
  • 1
  • 9
-1

Code snippet. Courtesy: multiple posts on Stack Overflow, including this one.

 db.cust.drop();
 db.zip.drop();
 db.cust.insert({cust_id:1, zip_id: 101});
 db.cust.insert({cust_id:2, zip_id: 101});
 db.cust.insert({cust_id:3, zip_id: 101});
 db.cust.insert({cust_id:4, zip_id: 102});
 db.cust.insert({cust_id:5, zip_id: 102});

 db.zip.insert({zip_id:101, zip_cd:'AAA'});
 db.zip.insert({zip_id:102, zip_cd:'BBB'});
 db.zip.insert({zip_id:103, zip_cd:'CCC'});

mapCust = function() {
    var values = {
        cust_id: this.cust_id
    };
    emit(this.zip_id, values);
};

mapZip = function() {
    var values = {
        zip_cd: this.zip_cd
    };
    emit(this.zip_id, values);
};

reduceCustZip = function(k, values) {
    var result = {};
    values.forEach(function(value) {
        var field;
        if ("cust_id" in value) {
            if (!("cust_ids" in result)) {
                result.cust_ids = [];
            }
            result.cust_ids.push(value);
        } else {
            for (field in value) {
                if (value.hasOwnProperty(field)) {
                    result[field] = value[field];
                }
            }
        }
    });
    return result;
};


db.cust_zip.drop();
db.cust.mapReduce(mapCust, reduceCustZip, {"out": {"reduce": "cust_zip"}});
db.zip.mapReduce(mapZip, reduceCustZip, {"out": {"reduce": "cust_zip"}});
db.cust_zip.find();


mapCZ = function() {
    var that = this;
    if ("cust_ids" in this.value) {
        this.value.cust_ids.forEach(function(value) {
            emit(value.cust_id, {
                zip_id: that._id,
                zip_cd: that.value.zip_cd
            });
        });
    }
};

reduceCZ = function(k, values) {
    var result = {};
    values.forEach(function(value) {
        var field;
        for (field in value) {
            if (value.hasOwnProperty(field)) {
                result[field] = value[field];
            }
        }
    });
    return result;
};
db.cust_zip_joined.drop();
db.cust_zip.mapReduce(mapCZ, reduceCZ, {"out": "cust_zip_joined"}); 
db.cust_zip_joined.find().pretty();


var flattenMRCollection=function(dbName,collectionName) {
    var collection=db.getSiblingDB(dbName)[collectionName];

    var i=0;
    var bulk=collection.initializeUnorderedBulkOp();
    collection.find({ value: { $exists: true } }).addOption(16).forEach(function(result) {
        print((++i));
        //collection.update({_id: result._id},result.value);

        bulk.find({_id: result._id}).replaceOne(result.value);

        if(i%1000==0)
        {
            print("Executing bulk...");
            bulk.execute();
            bulk=collection.initializeUnorderedBulkOp();
        }
    });
    bulk.execute();
};


flattenMRCollection("mydb","cust_zip_joined");
db.cust_zip_joined.find().pretty();
-3

You have to do that in your application layer. If you're using an ORM, it could use annotations (or something similar) to pull in references that exist in other collections. I have only worked with Morphia, and the @Reference annotation fetches the referenced entity when queried, so I am able to avoid doing it myself in code.
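
For illustration only, a minimal shell sketch of such an application-level join, where the addresses collection and the userId field are hypothetical names:

// for each user, fetch its related documents from another collection and attach them
db.users.find().forEach(function(user) {
    user.addresses = db.addresses.find({ userId: user._id }).toArray();
    printjson(user);
});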

lobster1234
  • 7,679
  • 26
  • 30