Generally it is recommended to run a single process per docker container, and that makes sense if you are only trying to run a single web application that requires several different tools.
For example, the open source web application kanboard makes use of:
- mysql
- apache
- php5
- memcache
Now, if that were the only web application I was going to run, it would make sense to run each tool in a separate container to take advantage of docker's one-process-per-container model.
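For concreteness, that layout might look roughly like the following, using plain `docker run` commands. This is only a sketch: the `my/kanboard-apache-php5` image and the container names are made up for illustration, and it assumes off-the-shelf `mysql` and `memcached` images.

```
# one container per process, for kanboard only
docker run -d --name kanboard-mysql -e MYSQL_ROOT_PASSWORD=secret mysql
docker run -d --name kanboard-memcache memcached

# the application container (apache + php5 + the kanboard code) is linked
# to the containers it depends on
docker run -d --name kanboard-web \
    -p 8080:80 \
    --link kanboard-mysql:mysql \
    --link kanboard-memcache:memcache \
    my/kanboard-apache-php5
```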
But say that, instead of running only one web application, I wanted to run several:
- kanboard
- etherpad
- plex
- owncloud
- dokuwiki
- discourse
Now, how can I use docker to isolate those web applications? The reason I ask is that each of the applications mentioned above might have its own:
- backend data store (mysql, postgres, sqlite)
- cache store (memcache, redis)
- concurrent task management (celery, queues, RQ, SHARQ)
- web server (nginx, apache)
- search server (lucene, sphinx, opensearchserver)
There are two ways that I know of to use docker to run those web applications:
- Run each application along with all of its dependencies in a single container: one for kanboard, one for etherpad, and so on (see the sketch after this list).
- Adhere to docker's dictum of "one process per container": create one container for mysql, postgres, sqlite, memcache and so on, plus one for each application's code itself, and use docker linking to link the related containers together (as sketched for kanboard above). This is messier, with a lot more organizing and management required.
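For the first option, each application would run as a single self-contained container. The image names below are hypothetical all-in-one images, each bundling its own database, cache and web server, typically driven by something like supervisord inside the container; the port mappings are illustrative too.

```
# option 1: one self-contained container per application
docker run -d --name kanboard -p 8081:80   my/kanboard-allinone
docker run -d --name etherpad -p 8082:9001 my/etherpad-allinone
docker run -d --name owncloud -p 8083:80   my/owncloud-allinone
```

The second option, by contrast, repeats the kanboard-style linking shown earlier for every application, so with half a dozen applications you quickly end up with fifteen or twenty containers to name, start in the right order, and link together.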
My question is: is there any other way? And if there isn't, which of the above options should I choose, and why?
Or am I using the wrong tool (docker containers) for the job? Perhaps there is another way of accomplishing application isolation without using docker containers?