Celery with RabbitMQ: AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'

Question:

I’m running the First Steps with Celery Tutorial.

We define the following task:

from celery import Celery

app = Celery('tasks', broker='amqp://guest@localhost//')

@app.task
def add(x, y):
    return x + y

Then call it:

>>> from tasks import add
>>> add.delay(4, 4)

But I get the following error:

AttributeError: 'DisabledBackend' object has no attribute '_get_task_meta_for'

I’m running both the celery worker and the RabbitMQ server. Rather strangely, the celery worker reports the task as succeeding:

[2014-04-22 19:12:03,608: INFO/MainProcess] Task test_celery.add[168c7d96-e41a-41c9-80f5-50b24dcaff73] succeeded in 0.000435483998444s: 19 

Why isn’t this working?

Asked By: Casebash


Answers:

Just keep reading the tutorial. It is explained in the Keep Results chapter.

To start Celery you only need to provide the broker parameter, which is required to send messages about tasks. If you want to retrieve information about the state and results returned by finished tasks, you also need to set the backend parameter. You can find the full list with descriptions in the Configuration docs: CELERY_RESULT_BACKEND.
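
For example, a minimal sketch of the tutorial app with a result backend configured (the rpc:// backend is just one option; any backend listed in the docs works):

from celery import Celery

# broker: where task messages are sent; backend: where task states and results are stored
app = Celery('tasks',
             backend='rpc://',
             broker='amqp://guest@localhost//')

@app.task
def add(x, y):
    return x + y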

Answered By: daniula

I suggest having a look at:
http://www.cnblogs.com/fangwenyu/p/3625830.html

There you will see that
instead of

app = Celery('tasks', broker='amqp://guest@localhost//')

you should be writing

app = Celery('tasks', backend='amqp', broker='amqp://guest@localhost//')

This is it.
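
With the backend set, fetching the result from the client session works, roughly like this (a sketch; get() blocks until the worker has stored the result):

>>> from tasks import add
>>> result = add.delay(4, 4)
>>> result.get(timeout=10)
8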

Answered By: TorokLev

In case anyone made the same easy-to-make mistake as I did: the tutorial doesn’t say so explicitly, but the line

app = Celery('tasks', backend='rpc://', broker='amqp://')

is an EDIT of the line in your tasks.py file. Mine now reads:

app = Celery('tasks', backend='rpc://', broker='amqp://guest@localhost//')

When I run python from the command line I get:

$ python
>>> from tasks import add
>>> result = add.delay(4,50)
>>> result.ready()
False

All tutorials should be easy to follow, even when a little drunk. So far this one doesn’t reach that bar.
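
For what it’s worth, ready() returns False only until the worker has finished the task; checking the same result handle a moment later (a sketch of the same session):

>>> result.ready()
True
>>> result.get(timeout=5)
54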

Answered By: Diederik

In your project directory, find the settings file.

Then run the below command in your terminal:

sudo vim settings.py

Copy/paste the below config into your settings.py:

CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend'

Note: this is the backend for storing task results if you are using the django-celery package for your Django project.
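
A rough sketch of the relevant settings for a django-celery setup (the broker URL is a placeholder, and djcelery must also be listed in INSTALLED_APPS):

# settings.py
BROKER_URL = 'amqp://guest@localhost//'  # placeholder; use your broker URL
CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'

INSTALLED_APPS = [
    # ...
    'djcelery',
]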

Answered By: Carlisle

What is not clear from the tutorial is that the tasks.py module needs to be edited so that you change the line:

app = Celery('tasks', broker='pyamqp://guest@localhost//')

to include the RPC result backend:

app = Celery('tasks', backend='rpc://', broker='pyamqp://')

Once done, Ctrl + C the celery worker process and restart it:

celery -A tasks worker --loglevel=info

The tutorial is confusing in that it’s easy to assume that the creation of the app object happens in the client testing session, which it does not.

Answered By: Chris Foote

I had the same issue; what resolved it for me was to import the celery file (celery.py) in the __init__.py of your app with something like:

from .celery import CELERY_APP as celery_app

__all__ = ('celery_app',)

This assumes you use a celery.py file as described here.
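
For reference, a sketch of what that celery.py might contain in a Django project (the project name proj is a placeholder; the variable name CELERY_APP matches the import above):

# proj/celery.py
import os
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

CELERY_APP = Celery('proj')
CELERY_APP.config_from_object('django.conf:settings', namespace='CELERY')
CELERY_APP.autodiscover_tasks()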

Answered By: fsulser

Celery relies on both a backend AND a broker.
This solved it for me, using only Redis:

app = Celery("tasks", backend='redis://localhost',broker="redis://localhost")

Remember to restart the worker in your terminal after changing the config.

Answered By: Punnerud

My case was simple: I was using the interactive Python console, and Python had cached the imported module. I killed the console and started it again, and everything worked as it should.

import celery


app = celery.Celery('tasks', broker='redis://localhost:6379',
                    backend='mongodb://localhost:27017/celery_tasks')

@app.task
def add(x, y):
    return x + y

In the Python console:

>>> from tasks import add
>>> result = add.delay(4, 4)
>>> result.ready()
True

Answered By: Omony

Switching from Windows to Linux solved the issue for me.
Windows is not guaranteed to work; it’s mentioned here.

Answered By: FarisHijazi

I solved this error by adding app after taskID:

response = AsyncResult(taskID, app=celery_app)

where celery_app = Celery('ANYTHING', broker=BROKER_URL, backend=BACKEND_URL)

If you want to get the status of the Celery task to know whether it is "PENDING", "SUCCESS", or "FAILURE":

status = response.status
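
Putting the pieces together, a minimal sketch (the broker/backend URLs and the taskID value are placeholders; taskID is the id returned by .delay()):

from celery import Celery
from celery.result import AsyncResult

BROKER_URL = 'redis://localhost:6379/0'   # placeholder
BACKEND_URL = 'redis://localhost:6379/1'  # placeholder

celery_app = Celery('ANYTHING', broker=BROKER_URL, backend=BACKEND_URL)

taskID = '168c7d96-e41a-41c9-80f5-50b24dcaff73'  # placeholder task id
response = AsyncResult(taskID, app=celery_app)
status = response.status  # "PENDING", "SUCCESS", "FAILURE", ...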

Answered By: Alankrith G