
I am using recent versions of Celery, Flower, and Redis, but I cannot understand what is happening. My tasks run and the data comes back correctly, but when I view the tasks in Flower, all of the tasks and their results are still listed.

My task has the following decorator:

@celery.task(ignore_result=True, bind=True)
def perform_long_task(self, urls):

I've tried calling get(), forget(), etc., but when I look at Flower the results are always there.
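For context, this is roughly how I dispatch the task and what I tried (a sketch only; the URL is made up, and this obviously won't run without the app above and a broker):

```python
# Sketch: assumes the Celery app and perform_long_task defined above.
from test.api.tasks import perform_long_task

result = perform_long_task.delay(["http://example.com"])  # hypothetical URL
result.get(timeout=60)  # wait for the return value (before I set ignore_result)
result.forget()         # ask the result backend to drop the stored result
```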

I've also tried:

CELERY_TASK_RESULT_EXPIRES=10
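That setting lives in my Celery config module (old-style uppercase name; I believe `result_expires` is the Celery 4+ equivalent):

```python
# Celery configuration fragment (not a complete config).
# Seconds before stored results become eligible for cleanup.
CELERY_TASK_RESULT_EXPIRES = 10
```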

I launch the task with:

celery worker -l info -B -A test.api.tasks

I've tried:

from celery.schedules import crontab

class CeleryConfig:
    CELERYBEAT_SCHEDULE = {
        'check-every-minute': {
            'task': 'celery.backend_cleanup',
            'schedule': crontab(hour="*/1"),
        }
    }

I cannot get the tasks to stop showing in Flower, so I'm thinking the results must still be stored (taking up memory).

Is this true? Any thoughts on how to get them to disappear?

Thanks!

mattjvincent
• Recently I faced a similar problem with the `task_results` in Celery. In my case the GC never cleaned up the memory when a `GroupResult` or `AsyncResult` was returned. Calling `.forget()` on them worked (the metadata got deleted from my Redis), but the `GroupResult` still persisted in memory. The only solution I found was to not use a `results_backend` and build my own implementation of the `task_result`. I'm not sure if you need the `task_results`; if you do, you can store the goal of your task instead of the `task_result`: https://stackoverflow.com/a/38267978/2525104 – Kenjin Jul 09 '18 at 10:37

0 Answers