
We are trying to run two apps via docker-compose. These apps are (obviously) in separate folders, each of them having its own docker-compose.yml. On the filesystem it looks like this:

dir/app1/
  -...
  -docker-compose.yml
dir/app2/
  -...
  -docker-compose.yml

Now we need a way to compose these guys together, because they have some nitty-gritty integration over HTTP.

The issue with the default docker-compose behaviour is that it treats all relative paths with respect to the folder it is being run from. So if you go to dir from the example above and run

docker-compose -f app1/docker-compose.yml -f app2/docker-compose.yml up

you'll be out of luck if either of your docker-compose.yml files uses relative paths to env files or anything else.
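For example, suppose app2/docker-compose.yml pulls in an env file via a relative path (the service name and the .env file below are made up for illustration):

# dir/app2/docker-compose.yml (hypothetical contents)
version: '2'
services:
  api:
    build: .
    env_file:
      - ./.env    # with the merged command above, this is NOT resolved against dir/app2/

Run from dir/ as shown, compose resolves build: . and ./.env against the project directory (derived from the first -f file, or the working directory, depending on the version), not against dir/app2/, so both the build context and the env file end up pointing at the wrong place.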

Here's the list of ways that actually work, but have their drawbacks:

  1. Run those apps separately, and use networks.

It is described in full at Communication between multiple docker-compose projects

I've tested that just now, and it works. Drawbacks:

  • you have to mention the network in docker-compose.yml and push that to the repository some day, rendering the entire app un-runnable without the app that publishes the network.

  • you have to come up with some clever way for those apps to actually wait for each other

  2. Use absolute paths. Well, it is just bad and does not need any elaboration.

  3. Expose the ports you need on the host machine and make the apps talk to the host without knowing a thing about each other. That, too, is obviously meh.

So, the question is: how can one manage the task with just docker-compose?

Stepan Salin
  • Why not just use one compose file to run both of your applications? You can use `links` to control the dependency between them. – Haoming Zhang Oct 08 '16 at 22:31
  • So, the basic idea is to write a new `docker-compose.yml` one level above the apps, and handle all paths and links there? That could work, but it is not clear in which app's repo it should belong. But still, it is the least dirty way, thanks for the suggestion! – Stepan Salin Oct 08 '16 at 22:35
  • I use a similar approach, with a separate project to pull the individual pieces together. I use Git submodules to clone the `app1` and `app2` projects into a common project, and then use a `docker-compose.yaml` there that imports the services from the individual apps. Works really well... – nwinkler Oct 10 '16 at 10:29
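For the record, a minimal sketch of the single parent compose file suggested in the comments could look like this (all paths and service names here are hypothetical; app1 and app2 could just as well be Git submodules, as nwinkler describes):

# dir/docker-compose.yml (hypothetical parent project)
version: '2'
services:
  app1:
    build: ./app1
    env_file:
      - ./app1/.env   # paths are now relative to dir/, so they have to be spelled out
  app2:
    build: ./app2
    depends_on:
      - app1          # in the v2 format both services sit on the project's default
                      # network and can reach each other by service name, so
                      # explicit links are usually unnecessary

The obvious downside, as noted in the comments, is deciding which repository this parent file lives in.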

1 Answer


Thanks to everyone for your feedback. Within our team we have agreed to the following solution:

Use networks & override

Long story short, your original docker-compose.yml files should not change a bit. All you have to do is create a docker-compose.override.yml next to each of them, which publishes the network and hooks your services into it.
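A rough sketch of what the override for the app that owns the network might look like (the service and network names are made up; this assumes the version 2 file format):

# dir/app1/docker-compose.override.yml (hypothetical) -- publishes the shared network
version: '2'
services:
  web:
    networks:
      - shared
      # if app1 has several services talking to each other over the default
      # network, list 'default' here as well, because an explicit networks key
      # replaces the implicit default attachment
networks:
  shared:
    driver: bridge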

So, whoever wants to run an app standalone uses

docker-compose -f docker-compose.yml up

But when you need to run the apps side by side and have them communicate with each other, you should go with

docker-compose -f docker-compose.yml -f docker-compose.override.yml up
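The other app's override then joins that network as an external one (again, the names are illustrative; app1_shared is what Compose names the network created by the app1 project, since network names get prefixed with the project name, which defaults to the directory name):

# dir/app2/docker-compose.override.yml (hypothetical) -- attaches to app1's network
version: '2'
services:
  api:
    networks:
      - default        # keep app2's own services reachable among themselves
      - app1_shared
networks:
  app1_shared:
    external: true

The external network has to exist before app2 is started, so app1 (or at least its network) must come up first. Note also that a bare docker-compose up picks up docker-compose.override.yml automatically when it sits next to docker-compose.yml, which is why the standalone invocation spells out -f docker-compose.yml explicitly: that way the override, and with it the network wiring, is skipped.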
Stepan Salin