
Context
We want to back up a database called DatabaseA on an M2 cluster by cloning it into a new database called DatabaseB every 24 hours, using MongoDB Atlas.

We've already looked into the following links, but we have no idea how to implement this:

How should we approach this?

Vadim Kotov

1 Answer


There are two approaches.

The best way (in my eyes) is to use MongoDB Atlas Data Lake. They have a guide here showing how to copy from MongoDB to S3 continuously, but the same approach can also be used to copy from one MongoDB database to another.

To summarise and supplement the linked docs, the rough approach is:

  1. Create a Data Lake and link it to your MongoDB database in Atlas
  2. Create a scheduled trigger that runs every 24 hours. See the snippet below

exports = async function() {
  // Source collection, read through the linked Data Lake service
  const sourceCollection = context.services
    .get("DataLake0")
    .db("Database0")
    .collection("Collection0");

  const pipeline = [
    // Match every document in the source collection
    { $match: {} },
    // Write the results into the target cluster/database
    {
      $out: {
        atlas: {
          projectId: "111111111111111111111111",
          clusterName: "<YourOtherCluster>",
          db: "<YourOtherDB>",
          coll: "<CollectionName>"
        }
      }
    }
  ];

  return sourceCollection.aggregate(pipeline);
};
I did this mostly through the UI, but there's a way to do it programmatically, shown here: DataLake API docs
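As a rough sketch of the programmatic route, the Atlas Admin API can create the Data Lake tenant with an authenticated HTTP call. This is an assumption based on the v1.0 `dataLakes` endpoint described in the linked API docs; the group ID, key names, and tenant name below are placeholders, not values from the question.

```shell
#!/bin/sh
# Sketch: create a Data Lake tenant via the Atlas Admin API (v1.0,
# digest auth). Replace the placeholder keys and group ID with yours.
ATLAS_PUBLIC_KEY="your-public-key"
ATLAS_PRIVATE_KEY="your-private-key"
GROUP_ID="111111111111111111111111"   # your Atlas project ID

curl --user "$ATLAS_PUBLIC_KEY:$ATLAS_PRIVATE_KEY" --digest \
  --header "Content-Type: application/json" \
  --request POST \
  "https://cloud.mongodb.com/api/atlas/v1.0/groups/$GROUP_ID/dataLakes" \
  --data '{ "name": "DataLake0" }'
```

The same API (and the Atlas Triggers API) can then be used to define the scheduled trigger, so the whole backup setup is reproducible without touching the UI.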

Another approach, recommended in a similar question (not directly about the Data Lake API; only mentioning it since it's related), is to use:

  • mongodump --host="mongodb0.example.com"

then

  • mongorestore --host="mongodb0.example.com"

See linked question or docs for more details.
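For the 24-hour clone in the question, the two commands above can be piped together and scheduled. This is a sketch, not a tested production script: the connection string is a hypothetical placeholder, and it assumes the standard `--archive`, `--drop`, and `--nsFrom`/`--nsTo` options of the MongoDB database tools.

```shell
#!/bin/sh
# clone_db.sh -- copy DatabaseA into DatabaseB in one pass.
# Hypothetical Atlas connection string; replace with your own SRV URI.
set -eu

SRC_URI="mongodb+srv://user:pass@cluster0.example.mongodb.net"

# Dump DatabaseA as an archive to stdout and pipe it straight into
# mongorestore, renaming every namespace DatabaseA.* -> DatabaseB.*.
# --drop replaces any existing collections in DatabaseB first.
mongodump --uri="$SRC_URI/DatabaseA" --archive \
  | mongorestore --uri="$SRC_URI" --archive --drop \
      --nsFrom='DatabaseA.*' --nsTo='DatabaseB.*'
```

You'd then run it every 24 hours from wherever you schedule jobs, e.g. a cron entry such as `0 3 * * * /path/to/clone_db.sh`. Note that on a shared M2 tier this puts dump/restore load on the same cluster, which is part of why the Data Lake approach above may be preferable.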

Kai