I've been battling with this task all day.
I have a Django app. I use Celery for asynchronous tasks. Occasionally, I want to create a periodic task. The number of times the task will run is unknown, but it will eventually need to be deleted. So the task could be like this:
from celery import shared_task
from django_celery_beat.models import PeriodicTask

@shared_task
def foobar_task(id):
    if this_should_run:
        do_task()
    else:
        # The task is no longer needed; remove its schedule entry.
        PeriodicTask.objects.get(name='{} task'.format(id)).delete()
My app is running. I have celery beat running in a Docker container, started with celery --app=myproject beat --loglevel=info --scheduler=django. I have another container running the standard celery worker.
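For completeness, my understanding is that the `--scheduler=django` shorthand is meant to resolve to django-celery-beat's database scheduler; the explicit form, which I believe is equivalent, would be:

```shell
celery --app=myproject beat --loglevel=info \
    --scheduler django_celery_beat.schedulers:DatabaseScheduler
```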
So now I want to dynamically create my periodic task. I have a view/API endpoint that triggers something like this:
import json

from django_celery_beat.models import IntervalSchedule, PeriodicTask

schedule, _ = IntervalSchedule.objects.get_or_create(
    every=15, period=IntervalSchedule.SECONDS)
PeriodicTask.objects.create(
    interval=schedule,
    name='{} task'.format(id),
    task='myapp.tasks.foobar_task',
    args=json.dumps([id]))  # pass the id through to the task
In the Django admin, I can see that the periodic task has been created. However, watching the logs for both the celery worker container and the celery beat container, nothing happens.
Why is celery beat not picking up that there's a new periodic task? I don't want to have to restart celery beat every time a new task is created or deleted.
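For context on why I expected this to work: as far as I can tell, a database-backed scheduler like django-celery-beat's polls a "last changed" marker on every tick and only reloads the full schedule when that marker moves. A toy sketch of that pattern (plain Python, all names here are mine, not the library's):

```python
class ToyScheduler:
    """Toy model of a database-backed beat scheduler: it only reloads
    its schedule when a shared 'last_update' marker changes."""

    def __init__(self, store):
        self.store = store
        self.last_seen = None
        self.schedule = {}

    def maybe_reload(self):
        # Cheap check on every tick; full reload only when the marker moved.
        if self.store['last_update'] == self.last_seen:
            return False
        self.last_seen = self.store['last_update']
        self.schedule = dict(self.store['tasks'])
        return True

store = {'last_update': 1, 'tasks': {'1 task': 'myapp.tasks.foobar_task'}}
sched = ToyScheduler(store)
sched.maybe_reload()               # initial load picks up '1 task'

# A new row alone is invisible until the marker is bumped:
store['tasks']['2 task'] = 'myapp.tasks.foobar_task'
unnoticed = sched.maybe_reload()   # marker unchanged, schedule stays stale

store['last_update'] = 2           # simulate the "schedule changed" signal
noticed = sched.maybe_reload()     # now the new task is loaded
```

If that model is right, a newly created row stays invisible to beat until something bumps the marker, which makes me suspect the change-notification step rather than the task creation itself.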
Note: I am using Django 1.11.2, PostgreSQL, Celery 4.0.2, Django Celery Beat 1.0.1.