
On the MongoDB Docker page there's the following tutorial on how to dump a MongoDB collection:

sudo docker exec container_name sh -c 'exec mongodump -d collection_name --archive' > /home/mongo_backup/all-collections.archive

I thought of creating another container that runs this dump periodically (twice a day, for example) and saves it to a folder mounted inside it. But can I run mongodump from a container other than the one that holds the collections? Can I run mongodump over the local network?
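
For example, something along these lines is roughly what I have in mind; the network name (`backend`), container name (`mongo`) and database name (`my_db`) below are placeholders, not my actual setup:

```sh
# Run mongodump from a throwaway container attached to the same user-defined
# Docker network as the MongoDB container, writing the archive to a mounted
# backup folder. "backend", "mongo" and "my_db" are placeholder names.
docker run --rm --network backend \
  -v /home/mongo_backup:/backup \
  mongo:4.2 \
  mongodump --host mongo --port 27017 -d my_db \
    --archive=/backup/all-collections.archive
```

Since both containers would sit on the same Docker network, the dump would go over that network instead of through `docker exec`.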

After this is solved, there's still the problem of where to send this backup. It can't be in the same place as my deployed code, because it could be erased accidentally.

Is this a good backup strategy? Any better ideas?

Guerlando OCs
  • Have you tried mapping the db to an external volume on your computer? You should be able to remove and rebuild the container using that as a persistent volume, and also back up that location. – Cory C Feb 15 '20 at 01:09
  • @CoryC yes, the MongoDB data is mapped, but not to a specific directory, just a named Docker volume. I didn't bother to map it to a file. – Guerlando OCs Feb 15 '20 at 01:13
  • You may create a custom Dockerfile ([Mongodb](https://stackoverflow.com/a/39365267/3710490) + [cron](https://stackoverflow.com/questions/37458287/how-to-run-a-cron-job-inside-a-docker-container)) and schedule mongodump (a rough sketch follows below). Or, `volume` the MongoDB binaries to your local directory, create a separate `cron` container and give it access to the MongoDB binaries. – Valijon Feb 15 '20 at 10:24
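
Following Valijon's comment, a minimal sketch of the scheduling part, assuming a host (or sidecar container) that has `cron` and the Docker CLI available; the container name, database name and backup path are placeholders:

```sh
# Hypothetical crontab entry: dump twice a day, at 06:00 and 18:00.
# "%" must be escaped in crontab, hence "\%" in the date format.
0 6,18 * * * docker exec container_name sh -c 'exec mongodump -d my_db --archive' > /home/mongo_backup/all-collections-$(date +\%F).archive
```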

1 Answer


> I thought of creating another container which runs this dump periodically [...] there's still the problem of where to send this backup

mgob - "MongoDB dockerized backup agent" does just that: it's a container running mongodump periodically with features to upload generated dump to various Clouds, S3 and SFTP.

We've been using it for some time both with Docker and Kubernetes with good results.
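
As a rough idea of how it's started (a sketch only, loosely based on the project's README; the image name, default ports and paths may differ between versions, so check the mgob docs):

```sh
# Run the mgob backup agent; backup "plans" are YAML files dropped into the
# mounted config directory, each describing the target MongoDB host, a cron
# schedule and optional S3/SFTP/cloud upload targets.
# Host paths below are placeholders.
docker run -d --name mgob -p 8090:8090 \
  -v /home/mongo_backup/config:/config \
  -v /home/mongo_backup/storage:/storage \
  stefanprodan/mgob
```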

Pierre B.