
This is not a duplicate of Celery: No Result Backend Configured?, because SQS is used here.

I keep getting the following error:

No result backend is configured. Please see the documentation for more information.

My production settings are the following:

import urllib.parse  # env is a django-environ Env instance defined earlier in settings

CELERY_BROKER_URL = 'sqs://%s:%s@' % (
    urllib.parse.quote(env.str('TASK_QUEUE_USER_ID'), safe=''),
    urllib.parse.quote(env.str('TASK_QUEUE_USER_SECRET'), safe=''))

BROKER_URL = CELERY_BROKER_URL
CELERY_ENABLE_REMOTE_CONTROL = False
CELERY_RESULT_BACKEND = None # Disabling the results backend
RESULT_BACKEND = None # Disabling the results backend

CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_DEFAULT_QUEUE = 'async_tasks'
SQS_QUEUE_NAME = 'async_tasks'
CELERY_ENABLE_REMOTE_CONTROL = False 
CELERY_SEND_EVENTS = False


CELERY_BROKER_TRANSPORT_OPTIONS = {
    'region': 'eu-west-2',
    'polling_interval': 3,
    'visibility_timeout': 3600,
}
CELERY_SEND_TASK_ERROR_EMAILS = True

#
# https://stackoverflow.com/questions/8048556/celery-with-amazon-sqs#8567665
#
CELERY_BROKER_TRANSPORT = 'sqs'
BROKER_TRANSPORT = 'sqs'
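
For context, these CELERY_-prefixed settings are picked up by a conventional celery.py along these lines (a rough sketch; the module names are taken from the worker command below and may differ slightly from the real project):

import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'async_tasks.settings')

app = Celery('async_tasks')
# Load every Django setting prefixed with CELERY_ (broker URL, result backend, serializers, ...).
app.config_from_object('django.conf:settings', namespace='CELERY')
app.autodiscover_tasks()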

Running Celery from the command line with DJANGO_ENV=production celery -A async_tasks worker -l info connects to SQS and polls, but when I try a demo call from the command line with DJANGO_ENV=production python manage.py check_async:

from django.core.management.base import BaseCommand, CommandError

import async_tasks.tasks as tasks


class Command(BaseCommand):
    help = 'Check if infrastructure for async tasks has been setup correctly.'

    def handle(self, *args, **options):
        try:
            print('Sending async request.')
            t = tasks.add.apply_async((2, 4))
            out = t.get(timeout=1)
            print(out)
            print(t.status)
        except Exception as e:
            print(e)
            raise CommandError('Error occurred')

I get the error above. I have tried this on my development machine with Redis, and everything works well.
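
For comparison, the development configuration that works is roughly this (a sketch; the local Redis defaults are assumptions):

# Local Redis as both broker and result backend in development.
CELERY_BROKER_URL = 'redis://localhost:6379/0'
CELERY_RESULT_BACKEND = 'redis://localhost:6379/1'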

Any ideas?


2 Answers


You need a Celery result backend configured to be able to store and collect task results. Using Celery with an SQS broker without a result backend is fine for "fire and forget" patterns, but it is not enough if you want to access the results of your tasks through methods like get().
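
For example, a minimal sketch of keeping SQS as the broker while pointing the result backend at something that can actually store results (the Redis URL is only an assumption; any supported backend works):

# Keep SQS as the broker, but give Celery somewhere to store results so .get() works.
CELERY_BROKER_URL = 'sqs://...'                       # unchanged from the question
CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'    # instead of None

With no result backend configured, apply_async() still sends the task, but calling get() on the returned AsyncResult raises exactly the "No result backend is configured" error shown in the question.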

Sean Azlin

Maybe this will help someone. The answer above is correct, but if you still want to use Django, SQS, and Celery and still see task results, you can use Django's ORM or cache framework as the result backend via the django-celery-results library.

Django-celery-results

Celery Documentation - ORM Cache Framework
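
A minimal sketch of that setup, assuming django-celery-results has been installed with pip (the setting values below are the ones its documentation describes):

INSTALLED_APPS = [
    # ... existing apps ...
    'django_celery_results',
]

# Store task results in the Django database via the ORM:
CELERY_RESULT_BACKEND = 'django-db'
# or use the Django cache framework instead:
# CELERY_RESULT_BACKEND = 'django-cache'

Then run python manage.py migrate django_celery_results to create the results table.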

L. P.
    While this link may answer the question, it is better to include the essential parts of the answer here and provide the link for reference. Link-only answers can become invalid if the linked page changes. – csabinho Nov 08 '19 at 22:03
  • Are you saying the django-celery-results library works with SQS? On the [Celery SQS page](https://docs.celeryproject.org/en/stable/getting-started/backends-and-brokers/sqs.html), it suggests that no result backend exists if you use SQS? I'm so confused now... – Jarad Oct 04 '21 at 19:39