
I have 2 programs that need to be run periodically.

simple_job_1.py:

from datetime import datetime
import time

print("Starting job 1 ... ", datetime.now())
print("Doing stuff in job 1 for 20 seconds .........")
time.sleep(20)
print("Stopping job 1 ... ", datetime.now())

simple_job_2.py:

from datetime import datetime
import time

print("Starting job 2 ... ", datetime.now())
print("Doing stuff in job 2 for 5 seconds .........")
time.sleep(5)
print("Stopping job 2 ... ", datetime.now())

And I have created 2 Docker images by building the following Dockerfiles:

For job1:

FROM python:3

# Create a folder inside the container
RUN mkdir /home/TestProj

# Copy everything from the current folder in host to the dest folder in container
COPY . /home/TestProj

WORKDIR /home/TestProj

CMD ["python", "simple_job_1.py"]

For job2:

FROM python:3

# Create a folder inside the container
RUN mkdir /home/TestProj

# Copy everything from the current folder in host to the dest folder in container
COPY . /home/TestProj

WORKDIR /home/TestProj

CMD ["python", "simple_job_2.py"]

Here is how I build these container images:

 docker build -t simple_job_1:1.0 .
 docker build -t simple_job_2:1.0 .

And here is my docker-compose yaml file:

simple_compose.yaml:

version: "3.9"
services:
  job1:
    image: simple_job_1:1.0
  job2:
    image: simple_job_2:1.0

Q1) I need to run this compose setup - say - every 10 minutes as a cronjob. How can I achieve that? I know that it is possible to run containers as cronjobs, but is it possible to do that with docker-compose?
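
What I have in mind on a plain Docker host is a crontab entry along these lines - a rough sketch, assuming the project lives under /path/to/TestProj (a placeholder path) and that the docker compose CLI plugin is installed (older setups would call docker-compose instead):

# Run the compose stack every 10 minutes; "up" in the foreground exits once
# both job containers have stopped. The path and log file below are placeholders.
*/10 * * * * cd /path/to/TestProj && docker compose -f simple_compose.yaml up >> /tmp/simple_jobs.log 2>&1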

Q2) Is it possible to run the docker-compose setup as a cronjob in Google Cloud GKE?

Q3) How can I make sure that job2 only starts after job1 completes?
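
For Q3 with plain docker-compose, one thing I came across is that newer Compose releases (the ones implementing the Compose Specification, i.e. Compose v2) seem to support a completion condition in depends_on. A sketch of what simple_compose.yaml might look like under that assumption (not tested against my exact Compose version):

version: "3.9"
services:
  job1:
    image: simple_job_1:1.0
  job2:
    image: simple_job_2:1.0
    depends_on:
      job1:
        # Start job2 only after the job1 container has exited successfully.
        # Requires a Compose version implementing the Compose Specification;
        # older docker-compose releases may not accept this condition.
        condition: service_completed_successfully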


ADDENDUM: Here is an example of a CronJob spec for running a container as a cronjob in GKE:

# cronjob.yaml
apiVersion: batch/v1beta1
kind: CronJob
metadata:
  name: simple-job-cj-1
spec:
  schedule: "0 8 * * *"
  concurrencyPolicy: Allow
  startingDeadlineSeconds: 100
  suspend: false
  successfulJobsHistoryLimit: 3
  failedJobsHistoryLimit: 1
  jobTemplate:
    spec:
      template:
        spec:
          containers:
          - name: simple-job-cj-1
            image: simple_job
          restartPolicy: OnFailure

But please note that this runs a single container. I am not an expert in this field, but I guess I can define multiple containers under the containers: section in the spec above, which probably(?) means I would not need docker-compose at all. But if that is the case, how can I make sure that job2 only starts after job1 completes running?
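
To make the ordering question concrete: as far as I understand, plain containers listed side by side in one pod start in parallel, so one option I have seen mentioned is to run job1 as an init container and job2 as the regular container. A rough sketch of what I have in mind (not tested; the registry path in the image references is a placeholder):

# cronjob-ordered.yaml (rough sketch, not tested)
apiVersion: batch/v1
kind: CronJob
metadata:
  name: simple-jobs-ordered
spec:
  schedule: "*/10 * * * *"     # every 10 minutes
  concurrencyPolicy: Forbid    # do not start a new run while a previous one is still running
  jobTemplate:
    spec:
      template:
        spec:
          # Init containers run to completion, one after another, before the
          # regular containers start, so job1 must finish before job2 begins.
          initContainers:
          - name: job1
            image: registry.example.com/simple_job_1:1.0   # placeholder registry path
          containers:
          - name: job2
            image: registry.example.com/simple_job_2:1.0   # placeholder registry path
          restartPolicy: OnFailure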

Comments:

  • […] sure it will work – Lawrence Cherone Feb 01 '22 at 10:13
  • @LawrenceCherone Thank you for your answer. I would appreciate it, though, if you could also show how to do that, if you get the chance. :) – edn Feb 01 '22 at 10:14
  • are you stuck on the cronjob, making or adding it? – Lawrence Cherone Feb 01 '22 at 10:16
  • @LawrenceCherone I added a 3rd question to my question above and an addendum by providing more information. Do you think you can help me to understand it better with this additional info? If I can list my containers in the cronjob.yaml, how can I make sure that job2 only runs after job1 completes? Since there is this dependency, I thought I probably need to solve this with docker compose but I maybe don't need to...? – edn Feb 01 '22 at 10:25
  • I assume it would be possible to create several CronJob resources using the same container images but different command and args for each job, and then use spec.concurrencyPolicy: Forbid and spec.startingDeadlineSeconds. – J.Vander Feb 02 '22 at 12:24
  • If you really just need to run two jobs sequentially, have a look at https://stackoverflow.com/a/46880653/5529712. Basically you run the first job as an init container. – Gari Singh Feb 03 '22 at 10:04
  • For more advanced cases, https://github.com/argoproj/argo-workflows might work as well – Gari Singh Feb 03 '22 at 10:07
