
I've been working on this problem for about 10 hours since this morning, but I haven't been able to find a solution.

I'm learning to use Celery and want to use it in one of my Django projects. I followed the official tutorials, but I can't get the results of my tasks back.

No matter whether a backend is configured or not, I always get this error when trying to access the results:

Traceback (most recent call last):
  File "<console>", line 1, in <module>
  File "G:\progs\python\Django-projects\celery_tutorial\env\lib\site-packages\celery\result.py", line 223, in get
    return self.backend.wait_for_pending(
  File "G:\progs\python\Django-projects\celery_tutorial\env\lib\site-packages\celery\backends\base.py", line 698, in wait_for_pending
    meta = self.wait_for(
  File "G:\progs\python\Django-projects\celery_tutorial\env\lib\site-packages\celery\backends\base.py", line 1029, in _is_disabled
    raise NotImplementedError(E_NO_BACKEND.strip())
NotImplementedError: No result backend is configured.
Please see the documentation for more information.

Here is my folder structure:

|- task_exe
   |- celery.py
   |- settings.py
   |- ...
|- tadder
   |- tasks.py
   |- ...
|- env
|- manage.py
|- ...

This is my celery.py file:

from __future__ import absolute_import, unicode_literals
from celery import Celery
from tadder.tasks import add

import os
os.environ.setdefault("DJANGO_SETTINGS_MODULE", 'task_exe.settings')
app = Celery("task_exe", backend='rpc://' , broker='amqp://localhost')
app.config_from_object('django.conf:settings', namespace="CELERY")

app.autodiscover_tasks()
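For reference, the official "First steps with Django" guide pairs this module with a `task_exe/__init__.py` that imports the app when Django starts, so that the shell and the `@shared_task` decorator bind to this configured app rather than to a default (backend-less) one. A sketch of that documented pattern (not the asker's code):

```python
# task_exe/__init__.py
from .celery import app as celery_app

__all__ = ('celery_app',)
```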

This is my tasks.py file:

from __future__ import absolute_import, unicode_literals
from celery import shared_task

@shared_task
def add(a, b):
    return a + b

And this is the command I use to start a Celery worker (on Windows 10, 32-bit):

path/to/my_django_project_where_you_can_find_manage.py> celery -A task_exe worker -l info --pool=solo

When I register a task, it is received and executed by Celery, and I can see the result in the worker's console output. But when I try to use result.get() to retrieve the result, I get the error noted above.

I've also tried changing the backend. I set it to "db+sqlite://result.sqlite3", and a result.sqlite3 file was created when I started a Celery worker. I can even find the task IDs inside the resulting .sqlite3 file, but it didn't change anything; I still get the error.

Here are the commands I use to register tasks and retrieve the results (output from Django's interactive shell, manage.py shell):

>>> from tadder.tasks import add  
>>> r = add.delay(10, 20)
>>> r.status
Traceback (most recent call last):
  File "<console>", line 1, in <module>
  File "G:\progs\python\Django-projects\celery_tutorial\env\lib\site-packages\celery\result.py", line 477, in state
    return self._get_task_meta()['status']
  File "G:\progs\python\Django-projects\celery_tutorial\env\lib\site-packages\celery\result.py", line 416, in _get_task_meta  
    return self._maybe_set_cache(self.backend.get_task_meta(self.id))
  File "G:\progs\python\Django-projects\celery_tutorial\env\lib\site-packages\celery\backends\base.py", line 558, in get_task_meta
    meta = self._get_task_meta_for(task_id)
AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'
>>> r.get()
Traceback (most recent call last):
  File "<console>", line 1, in <module>
  File "G:\progs\python\Django-projects\celery_tutorial\env\lib\site-packages\celery\result.py", line 223, in get
    return self.backend.wait_for_pending(
  File "G:\progs\python\Django-projects\celery_tutorial\env\lib\site-packages\celery\backends\base.py", line 698, in wait_for_pending
    meta = self.wait_for(
  File "G:\progs\python\Django-projects\celery_tutorial\env\lib\site-packages\celery\backends\base.py", line 1029, in _is_disabled
    raise NotImplementedError(E_NO_BACKEND.strip())
NotImplementedError: No result backend is configured.
Please see the documentation for more information.
>>>

And here is the Celery worker console's output:

-------------- celery@DESKTOP-N6RJF73 v5.1.1 (sun-harmonics)   
--- ***** -----
-- ******* ---- Windows-10-10.0.19041-SP0 2021-06-20 17:18:15   
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         task_exe:0x35c2778
- ** ---------- .> transport:   amqp://guest:**@localhost:5672//
- ** ---------- .> results:     rpc://
- *** --- * --- .> concurrency: 2 (solo)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
 -------------- [queues]
                .> celery           exchange=celery(direct) key=celery


[tasks]
  . tadder.tasks.add

[2021-06-20 17:18:15,954: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2021-06-20 17:18:15,990: INFO/MainProcess] mingle: searching for neighbors
[2021-06-20 17:18:17,044: INFO/MainProcess] mingle: all alone
[2021-06-20 17:18:17,101: WARNING/MainProcess] g:\progs\python\django-projects\celery_tutorial\env\lib\site-packages\celery\fixups\django.py:203: UserWarning: Using settings.DEBUG leads to a memory
            leak, never use this setting in production environments!
  warnings.warn('''Using settings.DEBUG leads to a memory

[2021-06-20 17:18:17,102: INFO/MainProcess] celery@DESKTOP-N6RJF73 ready.
[2021-06-20 17:19:35,297: INFO/MainProcess] Task tadder.tasks.add[77f22b77-e7ab-4a67-b12c-b8ebc282b629] received
[2021-06-20 17:19:40,170: INFO/MainProcess] Task tadder.tasks.add[77f22b77-e7ab-4a67-b12c-b8ebc282b629] succeeded in 0.0s: 30 
[2021-06-20 17:32:16,356: INFO/MainProcess] Task tadder.tasks.add[cf8351d7-c8df-47e1-826d-26f6dcb007ac] received
[2021-06-20 17:32:16,358: INFO/MainProcess] Task tadder.tasks.add[cf8351d7-c8df-47e1-826d-26f6dcb007ac] succeeded in 0.0s: 30

Can anyone help me get this working?

A.Mohammadi

1 Answer


How to run Celery on Windows?
I always had issues with Celery on Windows, even though technically you can run it using the --pool=solo option you have already indicated. The documentation is a bit misleading in this respect, and I was always getting inconsistent behaviour. I ended up installing WSL on Windows (Ubuntu 20), and I run Redis (or RabbitMQ) all day with processes and threads and no issues. It also more closely simulates a production environment, since you most likely won't be running just one worker.

This is my example config for testing:

app = Celery('app', broker="redis://", backend="redis://", include=['app.tasks'])

Apparently Windows support was removed in Celery 4.0 but re-added in 4.3. I'm sure someone running Celery natively on a Windows 10 machine may be able to chime in.

enjoi4life411