
Is it possible to stop django-celery-beat from saving the results of tasks that run at short intervals? By default, all results are saved to the Task Results table.

I cannot find this information in the Celery project documentation.

Alternatively, what should the Postgres autovacuum settings be so that the indexes don't take up so much disk space?

I want a simple solution. Overriding django-celery logic is not an option.


2 Answers


Do the rpc backend and task_ignore_result meet your needs?

from celery import Celery

app = Celery('APP_NAME', backend='rpc://', broker=BROKER_URL)
# Keep results only if you really need them (task_ignore_result = False);
# in all other cases it is better not to store them in the db at all.
# Note that with results ignored you can't use AsyncResult to check
# whether the task is ready or to get its return value.
app.conf.task_ignore_result = True

Some docs about the backend are here: https://docs.celeryproject.org/en/latest/userguide/tasks.html#rpc-result-backend-rabbitmq-qpid


I found a solution, and it's simpler than I thought. If you don't want the results saved to the database, just add ignore_result=True in the decorator:

@shared_task(ignore_result=True)