How do I use the @shared_task decorator for class-based tasks?

Question:

As seen in the documentation, the @shared_task decorator lets you create tasks without having any concrete app instance. The given examples show how to decorate a function-based task.

Can you help me understand how to decorate a class-based task?

Asked By: Juan Riaza


Answers:

The documentation you linked to says:

The @shared_task decorator lets you create tasks without having any concrete app instance:

As far as I can tell, the documentation is misleading, and should say:

The @shared_task decorator lets you create tasks that can be used by any app(s).

In fact, any Task must be attached to an app instance. My evidence comes from the celery source file celery/app/builtins.py:

def shared_task(constructor):
    """Decorator that specifies a function that generates a built-in task.

    The function will then be called for every new app instance created
    (lazily, so more exactly when the task registry for that app is needed).

    The function must take a single ``app`` argument.
    """
    _shared_tasks.add(constructor)
    return constructor

So while the decorator lets you define a task without a concrete app instance in hand, the decorated function MUST take an app argument, as the docstring says.

The next function follows:

def load_shared_tasks(app):
    """Create built-in tasks for an app instance."""
    constructors = set(_shared_tasks)
    for constructor in constructors:
        constructor(app)

You can confirm here that each function decorated with @shared_task will be invoked with an app argument.

Answered By: jdhildeb

Quoting Ask Solem from the celery-users thread where he explained the difference between @task and @shared_task. Here is a link to the thread.

TL;DR: @shared_task will create an independent instance of the task for each app, making the task reusable.

There is a difference between @task(shared=True) and @shared_task

The task decorator will share tasks between apps by default so that if you do:

app1 = Celery() 
@app1.task 
def test(): 
    pass 

app2 = Celery() 

the test task will be registered in both apps:

assert app1.tasks[test.name]
assert app2.tasks[test.name]

However, the name ‘test’ will always refer to the instance bound to the ‘app1’
app, so it will be configured using app1’s configuration:

assert test.app is app1 

The @shared_task decorator returns a proxy that always uses the task instance
in the current_app:

app1 = Celery() 

@shared_task 
def test(): 
    pass 
assert test.app is app1 


app2 = Celery() 
assert test.app is app2 

This makes the @shared_task decorator useful for libraries and reusable apps,
since they will not have access to the app of the user.
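The proxy behaviour quoted above can be approximated in plain Python. Everything here (`App`, `Celery`, `TaskProxy`, `current_app`) is a simplified stand-in, not the real Celery internals; in particular, the real @shared_task registers the task with every app, while this sketch only registers it with the current one.

```python
# Simplified model of "a proxy that always uses the task instance
# in the current_app".
current_app = None  # rebound whenever a new app is created

class App:
    def __init__(self):
        self.tasks = {}

def Celery():
    global current_app
    current_app = App()
    return current_app

class TaskProxy:
    """Resolves to whatever current_app is at access time."""
    def __init__(self, name):
        self.name = name

    @property
    def app(self):
        return current_app

def shared_task(fn):
    # Real @shared_task registers fn with every app (see builtins.py);
    # registering with just the current app keeps the sketch short.
    current_app.tasks[fn.__name__] = fn
    return TaskProxy(fn.__name__)

app1 = Celery()

@shared_task
def test():
    pass

assert test.app is app1

app2 = Celery()
assert test.app is app2  # the proxy now follows the newest app
```

The key point is that `test.app` is computed on every access rather than captured once, which is why the same name follows whichever app is current.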

In addition the default Django example project defines the app instance
as part of the Django project:

from proj.celery import app

and it makes no sense for a Django reusable app to depend on the project module,
as then it would not be reusable anymore.
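Coming back to the original question about class-based tasks: in real Celery a custom task class subclasses celery.Task and can be attached via the decorator's `base=` option (e.g. `@shared_task(base=MyTask)`). The sketch below uses simplified stand-ins for `Task` and `shared_task` so it runs without Celery installed; the shape of the pattern is what matters.

```python
# Plain-Python sketch of decorating a function with a custom task class,
# mimicking @shared_task(base=...). Task, shared_task and registry are
# stand-ins, not Celery's real implementations.
registry = {}

class Task:
    def run(self, *args, **kwargs):
        raise NotImplementedError

    def __call__(self, *args, **kwargs):
        return self.run(*args, **kwargs)

def shared_task(base=Task):
    """Decorator factory: wrap a function in an instance of `base`."""
    def decorator(fn):
        # Build a subclass of `base` whose run() is the decorated function.
        task_cls = type(fn.__name__, (base,), {'run': staticmethod(fn)})
        task = task_cls()
        registry[fn.__name__] = task
        return task
    return decorator

class LoggingTask(Task):
    """Custom task class: behaviour shared by every task that uses it."""
    def __call__(self, *args, **kwargs):
        print('running %s' % type(self).__name__)
        return super().__call__(*args, **kwargs)

@shared_task(base=LoggingTask)
def add(x, y):
    return x + y

print(add(2, 3))  # prints "running add", then 5
```

After decoration, `add` is an instance of a LoggingTask subclass rather than a bare function, which is exactly why the custom class's `__call__` runs around every invocation.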

Answered By: Saurabh