- I have tests that I run locally using a `docker-compose` environment.
- I would like to run these tests as part of our CI pipeline, using Jenkins with Kubernetes on Google Cloud (following this setup).
- I have been unsuccessful so far because docker-in-docker does not work.
It seems that right now there is no solution for this use-case. I have found other questions related to this issue: here and here.
I am looking for solutions that will let me run `docker-compose`. I have found solutions for running `docker`, but not for running `docker-compose`.
I am hoping someone else has had this use-case and found a solution.
Edit: Let me clarify my use-case:
1. When I detect a valid trigger (i.e. a push to the repo) I need to start a new job.
2. I need to set up an environment with multiple containers/instances (docker-compose).
3. The instances in this environment need access to code from git (mount volumes / create new images with the data).
4. I need to run the tests in this environment.
5. I then need to retrieve the results from these instances (JUnit test results for Jenkins to parse).
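To make the use-case concrete, the environment I bring up looks roughly like this minimal `docker-compose.yml` sketch (the service names, images, paths, and test command are all hypothetical placeholders, not my actual config):

```yaml
version: "3"
services:
  app:
    # Hypothetical application image, built from the checked-out code
    build: .
    volumes:
      - ./src:/app/src          # code from git mounted in (the part that breaks under docker-in-docker)
  tests:
    image: my-test-runner       # hypothetical test-runner image
    depends_on:
      - app
    volumes:
      - ./reports:/reports      # JUnit XML written here for Jenkins to parse
    command: run-tests --junit-out /reports/results.xml
```

Locally this works fine; the question is how to get the same topology running inside a Jenkins/Kubernetes agent.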
The problems I am having are with steps 2 and 3.
For step 2, there is a problem running more than one job in parallel: the Docker context is shared (docker-in-docker issues), so if jobs run on the same node they clash over shared resources (ports, for example). My workaround is to limit this to one running instance and queue the rest, which is not ideal for CI.
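To illustrate the clash: with a shared Docker daemon, any fixed host-port mapping in the compose file collides as soon as a second job brings up the same stack on that node (hypothetical fragment below):

```yaml
services:
  db:
    image: postgres:15
    ports:
      - "5432:5432"   # fixed host port: a second parallel job on the same daemon
                      # cannot bind it, so `docker-compose up` fails
```

Dropping the fixed host port (or namespacing projects) would dodge this particular symptom, but not the underlying shared-daemon problem.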
For step 3, there is a problem mounting volumes, again because the Docker context is shared (docker-in-docker issues). I cannot mount the code that I check out in the job, because it is not present on the host that actually runs the Docker instances I trigger. My workaround is to build a new image from my template, copy the code into it, and use that image for the test. This works, but it means I need `docker cp` tricks to get the data back out, which is also not ideal.
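A sketch of that workaround, with hypothetical image and path names (the copy-in/copy-out steps are the part I would like to avoid):

```dockerfile
# Dockerfile.test — template image with the job's checkout baked in,
# instead of a volume mount (which fails under docker-in-docker)
FROM my-test-base:latest
COPY . /workspace                 # copy the checked-out code into the image
WORKDIR /workspace
CMD ["run-tests", "--junit-out", "/workspace/reports/results.xml"]
```

The job then does roughly `docker build -f Dockerfile.test -t test-$BUILD_NUMBER .`, runs a container from that image, and afterwards pulls the results back with `docker cp <container>:/workspace/reports ./reports` so Jenkins can parse the JUnit XML.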