
I am trying to run a Python script on a recurring schedule that sends out emails. The script retrieves the email information from the same database that my application's backend API uses. That backend is already deployed as a service in my ECS cluster. I tried to set up a scheduled task, but from what I understand it launches a new Docker container for each run. Is there a way to run this task inside the already-existing container that is active at all times and acts as the API server?

I tried creating a scheduled task, but it appears to spin up a new Docker container every time it runs, which seems wasteful since the script is short and undemanding. I am also unsure about Lambda functions, because I need to access the database from inside the ECS container.

  • It sounds like you don't want to use any AWS scheduling feature, and just want to run a cron job inside the docker container directly. In that case, try searching for "Run cron in docker" and look at solutions like this one: https://stackoverflow.com/questions/37458287/how-to-run-a-cron-job-inside-a-docker-container – Mark B Jun 28 '23 at 13:08
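If you do go the in-container route the comment suggests, a lighter alternative to cron is a background timer thread started alongside the API server process, so no second process manager is needed. This is only a sketch: `send_emails` is a hypothetical stand-in for the real script's entry point, and the one-hour interval is an assumption.

```python
import threading


def run_periodically(task, interval_seconds, stop_event):
    """Call task() every interval_seconds until stop_event is set."""
    # Event.wait() returns False on timeout (run the task again)
    # and True once the event is set (shut down cleanly).
    while not stop_event.wait(interval_seconds):
        task()


def send_emails():
    # Hypothetical: replace with the real script's entry point,
    # which queries the database and sends the emails.
    print("sending emails")


if __name__ == "__main__":
    stop = threading.Event()
    worker = threading.Thread(
        target=run_periodically,
        args=(send_emails, 3600, stop),
        daemon=True,  # the thread dies with the API server process
    )
    worker.start()
    # ... start the API server here; call stop.set() on shutdown.
```

Because the thread lives inside the existing container, it restarts automatically whenever ECS restarts the service, but note it will run once per task replica if the service is scaled out.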

1 Answer


Normally, I would say this calls for simply running cron inside the container, but it's not best practice to run multiple processes in a Docker container.

What you could look at doing instead is using EventBridge Scheduler to call an endpoint on your service on a schedule, and have that endpoint trigger the script/process that sends out your emails.
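A minimal sketch of such an endpoint, using only the standard library so it runs on its own; the route `/tasks/send-emails` and the `send_emails` function are assumed names. In practice you would add the route to whatever framework the API already uses and protect it (for example, require a shared-secret header), since EventBridge will just be making an HTTP call to it.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer


def send_emails():
    # Hypothetical placeholder for the real script's logic:
    # query the database and send the emails.
    print("querying database and sending emails")


class TaskHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path == "/tasks/send-emails":
            send_emails()
            self.send_response(204)  # EventBridge only needs a 2xx back
        else:
            self.send_response(404)
        self.end_headers()


if __name__ == "__main__":
    # In the real service this route would live inside the existing API server.
    HTTPServer(("0.0.0.0", 8080), TaskHandler).serve_forever()
```

The upside of this shape is that the work runs inside the container that already has database access, while the scheduling itself stays in a managed AWS service rather than in a second process in the container.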
