I'm working on a microservice system deployed with Docker, and I've run into a few conceptual roadblocks that I'd love some help with. Each of my microservices runs on Python's Nameko framework with RabbitMQ as the message broker, and each service lives in its own git repo. I'm trying to set up a GitLab CI deployment for each service so that when master changes for any particular service, its container gets rebuilt and auto-deployed to my Docker Swarm.

My current attempt at a GitLab CI config:
stages:
  - build
  - deploy

build:
  stage: build
  image: docker
  variables:
    REG_HOST: 'local-registry-url'
  script:
    - docker login to local image repo
    - docker build -t $REG_HOST/$SOME_TAG .
    - docker push $REG_HOST/$SOME_TAG

deploy:
  stage: deploy
  image: docker
  script:
    - docker login to local image repo
    - docker stack deploy --with-registry-auth --compose-file=docker-stack-compose.yml $STACK_NAME
  only:
    - master
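To make the placeholder steps concrete, this is roughly what I imagine the filled-in build job looking like (untested; $REG_USER and $REG_PASSWORD would be GitLab secret variables I'd define, $CI_COMMIT_SHA is one of GitLab's predefined variables, and service-name is a placeholder):

```yaml
# Hypothetical filled-in build job. $REG_USER / $REG_PASSWORD are my own
# secret variables; $CI_COMMIT_SHA is provided by GitLab per pipeline.
build:
  stage: build
  image: docker
  variables:
    REG_HOST: 'local-registry-url'
  script:
    - docker login -u $REG_USER -p $REG_PASSWORD $REG_HOST
    - docker build -t $REG_HOST/service-name:$CI_COMMIT_SHA .
    - docker push $REG_HOST/service-name:$CI_COMMIT_SHA
```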
I'm creating a new docker-stack-compose.yml per service, but I'm not sure how to define a dependency that all of my services rely on, such as RabbitMQ, without redefining it in each docker-stack-compose.yml, and would that even work? Also, how would I put them all on the same network?
This is what my current work-in-progress compose file looks like for a test service:
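One approach I've been considering (untested; service-net, the file name, and the stack name are my own placeholders): create the overlay network once by hand with `docker network create --driver overlay --attachable service-net`, then give RabbitMQ its own small stack that joins that network as external, so none of the per-service compose files have to define the broker:

```yaml
# rabbitmq-stack-compose.yml -- a hypothetical shared broker stack,
# deployed once with:
#   docker stack deploy --compose-file=rabbitmq-stack-compose.yml broker
version: '3'
services:
  rabbitmq:
    image: rabbitmq:3-management
    networks:
      - service-net
networks:
  service-net:
    external: true    # created beforehand with `docker network create`
```

Each service stack would then mark service-net as external in the same way instead of declaring RabbitMQ itself, but I haven't verified this is the idiomatic way to share infrastructure across stacks.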
version: '3'
services:
  test:
    container_name: io-test
    image: local-registry-url/service-name
    tty: true
    ports:
      - '8000:8000'
    depends_on:
      - rabbitmq
    networks:
      - service-net
    deploy:
      mode: replicated
      replicas: 2
networks:
  service-net:
    external: true
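For completeness, my understanding is that once every stack shares the same overlay network, each Nameko service could reach the broker by its service name through the network's built-in DNS, so the per-service Nameko config would just be something like (assuming RabbitMQ's default guest credentials, and that the broker service is named rabbitmq):

```yaml
# config.yml for each Nameko service -- 'rabbitmq' should resolve via
# the shared overlay network's DNS; credentials here are the defaults.
AMQP_URI: 'amqp://guest:guest@rabbitmq:5672/'
```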
Any help with this would be greatly appreciated.