
I need your help!

I started using Docker this week and launched all the containers for a new Django project. The project has several databases, Python, a Django web server, Redis, Celery, etc. Each of these runs in a separate Docker container, and they are all started with the `docker-compose up` command.

This is my problem: when I type `docker-compose up` in the console, it starts all the services. Then I need to restore the dumps for each database (which takes about an hour). But when I use the PyCharm tools for docker-compose, it recreates some containers. It also recreates all my Postgres databases, together with ALL MY DATA!

Sometimes it doesn't recreate the containers and I can do my job, but if I make one wrong move, docker-compose erases my databases! I am tired of restoring them!

Is there a way to protect containers from being erased, to forbid recreating my Postgres containers?

PS: I've also tried exporting the Postgres containers to a .tar file. When I import it back, the database inside the container is fine, and importing the container is faster than restoring the data from SQL, but the metadata of the Docker image is different, so I can't use it.

Please, give me any ideas)

  • A Docker container is a wrapper around a single process; you're trying to ask, "can I prevent a process from ever exiting?", which, no, you can't really. If you're losing your database data on restart, then you probably need to arrange for it to be saved somewhere; see for example [How to persist data in a dockerized postgres database using volumes](https://stackoverflow.com/questions/41637505/how-to-persist-data-in-a-dockerized-postgres-database-using-volumes). It is extremely normal to delete and recreate containers. – David Maze Nov 18 '21 at 13:56

1 Answer


Try using volumes to store your data. Volumes keep the data even after containers are recreated. https://docs.docker.com/storage/volumes/
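
For example, here is a minimal docker-compose.yml sketch (the service name, volume name, image tag and credentials are assumptions, not taken from your project) that keeps the Postgres data directory in a named volume, so recreating the container does not touch the data:

```yaml
# docker-compose.yml (sketch; adjust names, image version and credentials to your project)
version: "3.8"

services:
  db:
    image: postgres:13
    environment:
      POSTGRES_DB: app_db          # hypothetical database name
      POSTGRES_USER: app_user      # hypothetical user
      POSTGRES_PASSWORD: app_pass  # hypothetical password
    volumes:
      # Named volume mounted at the official image's data directory:
      # the container can be removed and recreated, the volume survives.
      - pgdata:/var/lib/postgresql/data

volumes:
  pgdata:
```

With a setup like this, recreating the db container (by PyCharm or by `docker-compose up`) leaves the pgdata volume in place; only `docker-compose down -v` or an explicit `docker volume rm` would delete it, so you should only need to restore your dumps once.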

oklymeno