There are two approaches.
The best way (in my eyes) is to use MongoDB Atlas Data Lake.
They have a guide here showing how to copy from MongoDB to S3 continuously, but the same approach can also be used to copy from one MongoDB database to another.
To summarise and supplement the linked docs, the rough approach is:
- Create a Data Lake and link it to your MongoDB database in Atlas
- Create a scheduled trigger that runs every 24 hours.
See the pseudocode snippet below:

```javascript
// Scheduled trigger function (runs in Atlas App Services).
// "DataLake0", "Database0", "Collection0", the projectId, and the
// <...> values are placeholders — substitute your own names and IDs.
exports = async function () {
  // The federated (Data Lake) collection to read from
  const movies = context.services
    .get("DataLake0")
    .db("Database0")
    .collection("Collection0");

  const pipeline = [
    // Match everything; narrow this stage to copy only a subset
    { $match: {} },
    // Write the results into a collection on another Atlas cluster
    {
      $out: {
        atlas: {
          projectId: "111111111111111111111111",
          clusterName: "<YourOtherCluster>",
          db: "<YourOtherDB>",
          coll: "<CollectionName>",
        },
      },
    },
  ];

  return movies.aggregate(pipeline);
};
```
I did this mostly through the UI, but there's a way to do it programmatically, shown here:
DataLake API docs
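For the programmatic route, the Atlas Admin API uses digest auth with a programmatic API key pair. A rough sketch of creating a Data Lake that way — the endpoint path and payload here are my recollection of the v1.0 API, so verify them against the linked docs, and `{GROUP-ID}` / the key pair are placeholders:

```shell
# Create a Data Lake named "DataLake0" in project {GROUP-ID}
# via the Atlas Admin API (digest auth with an API key pair).
# Sketch only — confirm the exact path and body in the Data Lake API docs.
curl --user "{PUBLIC-KEY}:{PRIVATE-KEY}" --digest \
  --header "Content-Type: application/json" \
  --request POST \
  "https://cloud.mongodb.com/api/atlas/v1.0/groups/{GROUP-ID}/dataLakes" \
  --data '{ "name": "DataLake0" }'
```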
Another approach, recommended in a similar question
(not directly what you're asking about regarding the API; only mentioning it since it's related),
is to use:
mongodump --host="mongodb0.example.com"
then restore into the target deployment:
mongorestore --host="mongodb0.example.com"
See linked question or docs for more details.
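The two commands can also be combined into a single pipe via `--archive`, so the dump never touches disk. A minimal sketch — the URIs, database names, and namespaces are placeholders, and `--nsFrom`/`--nsTo` simply rename the source database into the target one:

```shell
# Stream a dump of sourceDb from one deployment straight into another,
# renaming it to targetDb on the way in.
# All hostnames and database names below are placeholders.
mongodump \
  --uri="mongodb://mongodb0.example.com:27017" \
  --db=sourceDb \
  --archive \
| mongorestore \
  --uri="mongodb://mongodb1.example.com:27017" \
  --archive \
  --nsFrom="sourceDb.*" \
  --nsTo="targetDb.*"
```

With no filename, `--archive` makes mongodump write the archive to stdout and mongorestore read it from stdin, which is what makes the pipe work.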