I know there are a ton of articles, blogs, and SO questions about running Python applications inside a Docker container. But I am looking for information on doing the opposite. I have a distributed application with a lot of independent components, and I have put each of these components inside a Docker container that I can run manually via

docker run -d <MY_IMAGE_ID> mycommand.py

But I am trying to find a way (a pythonic, object-oriented way) of running these containerized applications from a python script on my "master" host machine.

I can obviously wrap the command line calls into a subprocess.Popen(), but I'm looking to see if something a bit more managed exists.
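For reference, the subprocess approach I have now looks roughly like this (the image and script names are placeholders for my own):

```python
import subprocess

def build_run_command(image, command, detach=False):
    """Assemble the docker CLI invocation as an argument list."""
    args = ["docker", "run", "--rm"]
    if detach:
        args.append("-d")
    args.append(image)
    args.extend(command.split())
    return args

def run_container(image, command):
    """Run a container in the foreground and capture its stdout."""
    result = subprocess.run(
        build_run_command(image, command),
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    print(run_container("my-image", "mycommand.py"))
```

This works, but it is stringly-typed shelling out, which is what I am hoping to avoid.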

I have seen docker-py but I'm not sure if this is what I need or not; I can't find a way to use it to simply run a container, capture output, etc.
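From skimming the docs, it looks like newer releases of docker-py (now the docker SDK for Python) expose a high-level containers API that might be what I want; an untested sketch of what I think it would look like (image and command names are placeholders, and the import is deferred so the snippet loads without the package installed):

```python
def run_and_capture(image, command):
    """Run a container to completion and return its log output (bytes)."""
    import docker  # pip install docker
    client = docker.from_env()
    # containers.run() blocks until the container exits and returns its logs
    return client.containers.run(image, command, remove=True)

def run_detached(image, command):
    """Start a container in the background; returns a Container handle."""
    import docker  # pip install docker
    client = docker.from_env()
    container = client.containers.run(image, command, detach=True)
    # The handle exposes container.logs(), container.wait(), container.stop()
    return container

if __name__ == "__main__":
    print(run_and_capture("my-image", "python mycommand.py"))
```

But I have not verified this against a running daemon, so I would appreciate confirmation that this is the intended usage.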

If anyone has any experience with this, I would appreciate any pointers.

Brett
  • I am not sure I fully understand your question, but it makes sense. Recently I started deploying my Python projects inside Docker containers. I built a base Python image with virtualenv support, plus some development and production tooling. Sometimes I have experimented with running supervisord inside Docker. That seems a little strange to the Docker community, but using it in some of my projects has not been a problem at all. It works fine and makes it easy to configure a set of independent, really small services connected by a message broker like RabbitMQ or Mosquitto. Does this make sense for your question? – Andre Pastore Jan 22 '16 at 22:14
  • @apast Sort of. I have thought about using a messaging broker to talk to and from the containers. This seems overkill since I really just want a python script to basically run an application inside a docker container and capture its stdout. – Brett Jan 22 '16 at 22:17
  • For small applications and interprocess communication, a good choice is ZeroMQ or nanomsg. You can use either as a library, and they are a really good fit for small applications. If you don't split the jobs into separate containers, you can manage them with supervisord inside a single container. That may be the easiest way to get things running fast – Andre Pastore Jan 22 '16 at 22:24
  • This feels like something fabric (http://www.fabfile.org/) would be good for, but I haven't used it with docker – Foon Jan 22 '16 at 23:20
  • Look at the list of orchestration tools: [How to scale Docker containers in production](http://stackoverflow.com/q/18285212/4279). – jfs Jan 23 '16 at 08:30

0 Answers