91

I'm running the First Steps with Celery Tutorial.

We define the following task:

from celery import Celery

app = Celery('tasks', broker='amqp://guest@localhost//')

@app.task
def add(x, y):
    return x + y

Then call it:

>>> from tasks import add
>>> add.delay(4, 4)

But I get the following error:

AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'

I'm running both the celery worker and the RabbitMQ server. Rather strangely, the celery worker reports the task as succeeding:

[2014-04-22 19:12:03,608: INFO/MainProcess] Task test_celery.add[168c7d96-e41a-41c9-80f5-50b24dcaff73] succeeded in 0.000435483998444s: 19 

Why isn't this working?

Casebash
  • 43
    As a new user of Celery and RabbitMQ (or any library you want to learn), seeing errors when following a tutorial doesn't inspire confidence in the quality of the software. It is just plain frustrating. I want to learn how to use your library, not its workarounds. – Diederik Jul 28 '15 at 18:55

10 Answers

65

Just keep reading the tutorial; it is explained in the Keeping Results chapter.

To start Celery you only need to provide the broker parameter, which is required for sending messages about tasks. If you want to retrieve information about the state and results returned by finished tasks, you need to set the backend parameter. You can find the full list with descriptions in the Configuration docs: CELERY_RESULT_BACKEND.
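
For example, the tutorial's Keeping Results section configures a result backend alongside the broker. A minimal sketch, using the rpc:// backend and the broker URL from the question (adjust both for your setup):

from celery import Celery

# broker: where task messages are sent
# backend: where task states and return values are stored (what .status / .get() read)
app = Celery('tasks',
             backend='rpc://',
             broker='amqp://guest@localhost//')

@app.task
def add(x, y):
    return x + y

Restart the worker after making this change; a running worker keeps its old configuration.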

daniula
  • 27
    I didn't have any trouble following the tutorial, but still ran into this error and had a hard time correcting it. I was running Celery in one Ubuntu terminal window, and the Python interpreter in a second. In the first window, I added backend='rpc://' and restarted Celery. But Python, in the second window, wasn't aware of this change. After I pressed Ctrl+d to kill Python, and started Python again, it worked fine. – Steve Saporta Oct 13 '16 at 19:12
  • I get a 404 with the last link in your answer. – Bryan Oakley Jan 17 '17 at 17:02
  • @BryanOakley I've updated link. However, Celery v4 has changed settings here, so proceed with caution. – daniula Jan 17 '17 at 20:08
  • @SteveSaporta Very helpful comment! This little details should be mentioned in the documentation. – Kostas Demiris May 03 '17 at 09:55
  • 1
    The problem here is that you *shouldn't* need to specify a `result_backend` to simply return an `AsyncResult` instance. (This is what is returned by `.delay()`.) The `result_backend` should only be necessary to look at attributes of that result, such as `.status`. – Brad Solomon Apr 19 '19 at 16:07
  • 1
    I use Redis as broker, and this problem doesn't present itself. (I can run/call tasks fine without a `result_backend`.) But if this is still the case for amqp, I would call that a bug. Calling a task itself shouldn't require a `result_backend` specified. – Brad Solomon Apr 19 '19 at 16:08
  • For the Celery result backend you should use Redis, because RabbitMQ is not supported as a backend: https://github.com/celery/celery/issues/6384 – Linh Nguyen Dec 30 '22 at 09:43
47

I suggest having a look at: http://www.cnblogs.com/fangwenyu/p/3625830.html

There you will see that instead of

app = Celery('tasks', broker='amqp://guest@localhost//')

you should be writing

app = Celery('tasks', backend='amqp', broker='amqp://guest@localhost//')

This is it.

TorokLev
26

In case anyone made the same easy-to-make mistake as I did: the tutorial doesn't say so explicitly, but the line

app = Celery('tasks', backend='rpc://', broker='amqp://')

is an EDIT of the line in your tasks.py file. Mine now reads:

app = Celery('tasks', backend='rpc://', broker='amqp://guest@localhost//')

When I run python from the command line I get:

$ python
>>> from tasks import add
>>> result = add.delay(4,50)
>>> result.ready()
False
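
A False here just means the task hasn't finished yet; once the worker has processed it, the same result object reports completion. A short follow-up, assuming the worker was restarted after editing tasks.py:

>>> result.ready()
True
>>> result.get(timeout=1)
54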

All tutorials should be easy to follow, even when a little drunk. So far this one doesn't reach that bar.

Diederik
  • 5
  • Well, when you follow the tutorial, please remember that after editing tasks.py you must also re-import the add function from the tasks module! Despite the correct backend assignment, I kept getting this error until I quit Python in the console (>>> quit()), went back in ($ python) and retyped from tasks import add. – fanny Nov 07 '17 at 18:13
  • Agree, a bad tutorial. – Alex Apr 24 '19 at 16:15
  • Well, it still is confusing after +5 years! I just opened up a [pull-request](https://github.com/celery/celery/pull/6979) and added a new sentence to fix this – Pedram Oct 03 '21 at 08:39
6

What the tutorial doesn't make clear is that the tasks.py module needs to be edited so that you change the line:

app = Celery('tasks', broker='pyamqp://guest@localhost//')

to include the RPC result backend:

app = Celery('tasks', backend='rpc://', broker='pyamqp://')

Once done, Ctrl + C the celery worker process and restart it:

celery -A tasks worker --loglevel=info

The tutorial is confusing because it's easy to assume that the app object is created in the client testing session, when in fact it is defined in tasks.py.
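
With the edit in place and the worker restarted, a fresh client session should be able to retrieve the result. A sketch (before the edit, the .get() call is typically where the DisabledBackend error shows up):

$ python
>>> from tasks import add     # fresh import so the edited tasks.py is picked up
>>> result = add.delay(4, 4)
>>> result.get(timeout=10)
8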

barbsan
Chris Foote
3

In your project directory, find the settings file.

Then run the command below in your terminal:

sudo vim settings.py

Copy/paste the config below into your settings.py:

CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend'

Note: this is the backend for storing task states and results if you are using the django-celery package for your Django project.
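
In context, the relevant part of settings.py might look something like this (a sketch for the old django-celery / Celery 3.x setup; the broker URL is a placeholder):

import djcelery
djcelery.setup_loader()

BROKER_URL = 'amqp://guest@localhost//'  # placeholder; point this at your broker
CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'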

Javad
Carlisle
  • I had this issue when removing backend_result=rpc (i.e. relying on the default result backend, which is none), and then repeatedly calling result.ready() – axd Feb 09 '18 at 10:43
1

Celery relies on both a backend AND a broker. This solved it for me, using only Redis:

app = Celery("tasks", backend='redis://localhost',broker="redis://localhost")

Remember to restart the worker in your terminal after changing the config.
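
A typical restart looks like this (assuming the module containing the app is named tasks, as in the tutorial):

celery -A tasks worker --loglevel=info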

Punnerud
1

I solved this error by passing the app when constructing the AsyncResult:

from celery.result import AsyncResult

response = AsyncResult(taskID, app=celery_app)

where celery_app = Celery('ANYTHING', broker=BROKER_URL, backend=BACKEND_URL).

If you want to get the status of the Celery task, i.e. whether it is "PENDING", "SUCCESS" or "FAILURE":

status = response.status
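
Putting those pieces together, a minimal self-contained sketch (BROKER_URL, BACKEND_URL and taskID are placeholders for your own values):

from celery import Celery
from celery.result import AsyncResult

BROKER_URL = 'redis://localhost:6379/0'   # placeholder; use your broker URL
BACKEND_URL = 'redis://localhost:6379/0'  # placeholder; use your result backend URL

celery_app = Celery('ANYTHING', broker=BROKER_URL, backend=BACKEND_URL)

taskID = '...'  # the id returned by .delay() / .apply_async()
response = AsyncResult(taskID, app=celery_app)

status = response.status      # 'PENDING', 'SUCCESS', 'FAILURE', ...
if status == 'SUCCESS':
    value = response.result   # the task's return value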
0

My case was simple: I was using the interactive Python console, and Python had cached the imported module. I killed the console and started it again, and everything worked as it should.

import celery


app = celery.Celery('tasks', broker='redis://localhost:6379',
                    backend='mongodb://localhost:27017/celery_tasks')

@app.task
def add(x, y):
    return x + y

In the Python console:

>>> from tasks import add
>>> result = add.delay(4, 4)
>>> result.ready()
True
Omony
0

Switching from Windows to Linux solved the issue for me. Windows is not guaranteed to work; it's mentioned here.

FarisHijazi
-2

I had the same issue. What resolved it for me was to import the Celery app (from celery.py) in the __init__.py of your app, with something like:

from .celery import CELERY_APP as celery_app

__all__ = ('celery_app',)

This applies if you use a celery.py file as described here.
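
For reference, the celery.py being imported would look roughly like the standard Django/Celery setup from the docs (a sketch; 'proj' is a placeholder project name, and CELERY_APP matches the name used in the import above):

import os

from celery import Celery

# Make sure Django settings are loaded before the Celery app is configured.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

CELERY_APP = Celery('proj')
CELERY_APP.config_from_object('django.conf:settings', namespace='CELERY')
CELERY_APP.autodiscover_tasks()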

fsulser