
I posted this question originally on the Docker forums, but didn't get any response there.

I'm wondering how best to model a set of services; let's call them db, web, and batch. db is simply a running database server instance (think MySQL). web is a web application that needs to connect to the database. batch is a batch application that needs to connect to that same database, and it can/will run in parallel with web. db needs to be running for either web or batch to run, but web and batch can run independently of each other (one or both may be running at any given time). If both are running at once, they need to be talking to the same database instance (so db actually uses volumes_from a separate data volume container). If the use case were simpler (say, just db and web, which always run together), then both would simply be defined as services in the same compose file, with web having a link to db.

As far as I understand it, these can't all be defined in the same Docker Compose configuration. Instead, I would need three different configurations: one for db, which is launched first; one for web (which uses external_links to find db); and a third for batch (which also uses external_links to find db). Is that correct, or is there some mechanism available that I'm not considering? Assuming a multi-configuration setup is needed, is there a way to "lazily" initialize the db composition, if it isn't already running, when either the web or batch composition is launched?
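Concretely, the split I'm imagining would look something like this (file, project, and container names are just placeholders):

```yaml
# db.yml -- the shared database composition, started first, e.g.:
#   docker-compose -f db.yml -p shareddb up -d
db:
  image: mysql:5.6
  volumes_from:
    - dbdata
dbdata:
  image: busybox
  volumes:
    - /var/lib/mysql
```

```yaml
# web.yml -- a separate composition (batch.yml would be analogous);
# external_links points at the container name Compose generates for
# the db composition (<project>_<service>_1)
web:
  build: ./web
  external_links:
    - shareddb_db_1:db
```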

Jeff Evans

2 Answers


If web has a link defined to db in a docker-compose file, db will always start first.

As far as I know, Docker has no way of knowing when the database is actually up. It will be your web container's responsibility to start properly and retry until the database is up (with a timeout).

For your batch service, assuming that you don't want to start it every time you start your web and db containers (via a docker-compose up or run), you can try extending your service. See the docs for more information on this.
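For illustration, an extends-based batch setup might look like this (file and service names are made up; note that Compose's extends deliberately does not copy links or volumes_from, so those must be re-declared locally):

```yaml
# common.yml -- shared base service definition
appbase:
  build: .
  environment:
    - DB_HOST=db
```

```yaml
# batch.yml -- reuses the base definition but runs the batch command
db:
  image: mysql:5.6
batch:
  extends:
    file: common.yml
    service: appbase
  command: run-batch
  links:
    - db
```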


Either your applications in the web and batch images know how to handle database downtime and are able to wait for the db service to come up and auto-reconnect, or you have to write a shell script, run when the Docker container starts, that waits for the db to be available before starting the app.

Depending on the docker images you are using for the web and batch services, you would have to override CMD, ENTRYPOINT or both.

This question has examples of shell scripts which wait for a MySQL service to be up.

And here are other techniques for testing whether a network port is open.
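The core of such a wait script can be sketched as a small shell function that uses bash's /dev/tcp pseudo-device to probe the port (the function name and argument order are my own; adapt it to your image's ENTRYPOINT):

```shell
#!/bin/bash
# wait_for_port HOST PORT [TIMEOUT_SECONDS]
# Polls HOST:PORT once per second; returns 0 once the port accepts
# a TCP connection, 1 if TIMEOUT_SECONDS elapse first.
wait_for_port() {
  local host="$1" port="$2" timeout="${3:-30}" elapsed=0
  # bash fails the redirection if the connection is refused,
  # so /dev/tcp doubles as a simple port check
  until (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; do
    if [ "$elapsed" -ge "$timeout" ]; then
      echo "timed out waiting for $host:$port" >&2
      return 1
    fi
    sleep 1
    elapsed=$((elapsed + 1))
  done
}

# In an entrypoint wrapper you would then do something like:
#   wait_for_port db 3306 60 && exec "$@"
```

The wrapper would typically replace or wrap the image's ENTRYPOINT, with the original command passed through as CMD.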

Thomasleveil
  • That addresses the question of how to make other apps correctly depend on the database being up. But I'm still unclear on whether Docker, or Docker Compose, has a built-in facility to *start the database itself* the moment any of the dependent services are started (which would obviously need to handle the state where the DB is not yet running). – Jeff Evans Aug 18 '15 at 21:15
  • Docker and docker-compose both have no clue about the things that run inside the containers. And it is not their business to know that – Thomasleveil Aug 18 '15 at 21:24
  • That is correct. However, docker-compose allows you to externally define those dependencies (via `links`, `external_links`, and the like). My question was intended in that context of modeling. – Jeff Evans Aug 18 '15 at 22:12
  • If you make different docker-compose config files, then you would either have to manually launch the db one, wait for it to be up, and then manually launch the others, or come up with some sort of shell script to do the same thing for you. It does not change the problem: one way or the other, something has to wait for the db to be up – Thomasleveil Aug 18 '15 at 22:24
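The "shell script to do the same thing" could be as small as a function that checks whether the db container is already running before bringing up a dependent composition (all file, project, and container names below are hypothetical, following the multi-file layout from the question):

```shell
#!/bin/sh
# up_web: lazily start the shared db composition, then bring up web.
# Compose (v1) names containers <project>_<service>_1, so the db
# container created by `docker-compose -f db.yml -p shareddb up -d`
# would be shareddb_db_1.
up_web() {
  if ! docker inspect -f '{{.State.Running}}' shareddb_db_1 2>/dev/null \
      | grep -q true; then
    docker-compose -f db.yml -p shareddb up -d
  fi
  docker-compose -f web.yml -p web up
}
```

An analogous up_batch wrapper would do the same with batch.yml; the db-side wait from the answers above is still needed, since `up -d` returns before MySQL is ready.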