KeyError: Received unregistered task of type '' in Celery while the task is registered

Question:

I’m a bit new to Celery configuration.

I have a task named myapp.tasks.my_task, for example.

I can see myapp.tasks.my_task among Celery’s registered tasks when I run celery inspect registered. Doesn’t that mean the task is successfully registered? Why does it raise the following error for it:

KeyError celery.worker.consumer.consumer in on_task_received

Received unregistered task of type 'my_app.tasks.my_task'.
The message has been ignored and discarded.

Did you remember to import the module containing this task?
Or maybe you're using relative imports?

Please see
http://docs.celeryq.org/en/latest/internals/protocol.html
for more information.

The full contents of the message body was:
'[[], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]' (77b)

There are also other tasks in my_app.tasks and they work correctly; only this task fails and gets the KeyError:

from celery import shared_task

@shared_task(queue='celery')
def other_task():
    """ WORKS """
    ...

@shared_task(queue='celery')
def my_task():
    """ DOES NOT WORK """
    ...

Asked By: Ashkan Khademian


Answers:

It means that Celery can’t find the implementation of the task my_app.tasks.my_task when it is called. Here are some possible solutions you may want to look at:

Possible Solution 1:

You probably haven’t correctly configured either:

  • Celery imports, e.g. celery_app.conf.update(imports=['my_app.tasks']) or celery_app.conf.imports = ['my_app.tasks']
  • Or Celery include, e.g. celery_app = Celery(..., include=['my_app.tasks']) (see the sketch after this note)

Note: In a Django application, this can be skipped if you are already using celery_app.autodiscover_tasks(), since tasks are then automatically discovered in each installed app's ./<app_name>/tasks.py.
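
For reference, here is a minimal sketch of an app module wired up both ways. The module names, broker URL, and file name are assumptions for illustration, not taken from the question:

# celery.py - a minimal sketch; module names and broker URL are assumed
from celery import Celery

celery_app = Celery(
    'my_app',
    broker='redis://localhost:6379/0',  # assumed broker
    include=['my_app.tasks'],           # option 1: declare task modules here
)

# Option 2: equivalently, set `imports` on the configuration:
# celery_app.conf.update(imports=['my_app.tasks'])

# Django only: instead of the above, discover ./<app_name>/tasks.py
# in every installed app:
# celery_app.autodiscover_tasks()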

Possible Solution 2:

If you are only importing my_app, e.g. celery_app.conf.update(imports=['my_app']), then I assume you have a file my_app/__init__.py. Make sure that file imports the task my_app.tasks.my_task along with my_app.tasks.other_task, so that the Celery app knows such a task exists:

# Contents of my_app/__init__.py
from my_app.tasks import (
    my_task,
    other_task,
)

Possible Solution 3:

In case my_task was only newly added (whereas other_task was already an old, existing task), you might not have restarted the Celery worker yet, so it does not see the new task. Try restarting the worker.
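
After the restart, you can confirm that the worker sees the task. This is a quick sketch; the import path my_app.celery for the app instance is an assumption:

# check_registered.py - a sketch; the app's import path is assumed
from my_app.celery import celery_app

# Ask every live worker which tasks it has registered.
inspector = celery_app.control.inspect()
print(inspector.registered())
# e.g. {'worker1@host': ['my_app.tasks.my_task', 'my_app.tasks.other_task']}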

Another solution can be to add a name param in the shared_task decorator, i.e.:

from celery import shared_task

@shared_task(queue='celery', name='other_task')
def other_task():
    """ WORKS """
    ...

@shared_task(queue='celery', name='my_task')
def my_task():
    """ DOES NOT WORK """
    ...
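
Note that with an explicit name, the task is registered under that short name instead of its dotted module path, so callers and schedules must refer to it by that name. A minimal sketch of dispatching it (the app's import path is again an assumption):

from my_app.celery import celery_app  # assumed location of the app instance

print('my_task' in celery_app.tasks)  # True once the task module is imported
celery_app.send_task('my_task')       # dispatch by the registered name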

I saw this in a full demo project, https://github.com/melikesofta/django-dynamic-periodic-tasks, where the related code can be found.

Answered By: Dzhuang

I was able to solve the error by just restarting Celery. I use Celery 5.1.2:

celery -A core worker --pool=solo -l info

Answered By: Kai – Kazuya Ito

Back in the day, when I faced this problem, a senior solved it for me, unfortunately in a mushroom management way (an anti-pattern where the people doing the work are kept in the dark).
I came back to this problem recently to figure out the solution for our own project.

As Niel pointed out in their solution, we were using celery_app.autodiscover_tasks() in our project, and in that case we should import my_task in the __init__.py of the tasks package, like below:

# Contents of myapp/tasks/__init__.py
from .some_tasks_file import my_task

Also, we used Celery beat, and a task defined inside app.conf.beat_schedule must use the exact path to the function, like below (even though the function is imported in the __init__.py of the tasks package):

app.conf.beat_schedule = {
    'MY_TASK': {
        'task': 'myapp.tasks.some_tasks_file.my_task',
        'schedule': 60,  # every minute
    },
}
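
A quick way to verify that the schedule entry matches what is actually registered (a sketch; it assumes the app instance is the app object from the snippet above and that the task module has been imported):

# The 'task' value in beat_schedule must exactly match a key in the registry.
assert 'myapp.tasks.some_tasks_file.my_task' in app.tasks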

Hope this helps people with the same Celery configuration and problem.

Answered By: Ashkan Khademian