How to combine Celery with asyncio?

Question:

How can I create a wrapper that makes celery tasks look like asyncio.Task? Or is there a better way to integrate Celery with asyncio?

@asksol, the creator of Celery, said this:

It’s quite common to use Celery as a distributed layer on top of async I/O frameworks (top tip: routing CPU-bound tasks to a prefork worker means they will not block your event loop).

But I could not find any code examples specifically for the asyncio framework.

Asked By: max


Answers:

EDIT 01/12/2021: the previous answer (find it at the bottom) didn’t age well, so I added a combination of possible solutions that may satisfy those who are still looking for ways to use asyncio and Celery together.

Let’s quickly break down the use cases first (more in-depth analysis here: asyncio and coroutines vs task queues):

  • If the task is I/O bound then it tends to be better to use coroutines and asyncio.
  • If the task is CPU bound then it tends to be better to use Celery or other similar task management systems.

So, in the spirit of "Do one thing and do it well", it makes sense not to try to mix asyncio and Celery together.
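
To make that split concrete, here is a minimal sketch (the broker URL and function names below are placeholders, not from the original answer): the I/O-bound work stays in a coroutine, while the CPU-bound work becomes a Celery task.

import asyncio
from celery import Celery

app = Celery('split_example', broker='a_broker_url_goes_here')

# I/O-bound: a coroutine lets the event loop serve other work while it waits.
async def fetch_status():
    await asyncio.sleep(1)  # stand-in for a network or database call
    return 'ok'

# CPU-bound: a Celery task runs in a separate worker process,
# so it never blocks the event loop.
@app.task(name='crunch_numbers')
def crunch_numbers(n):
    return sum(i * i for i in range(n))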

BUT what happens in cases where we need to run a method both asynchronously and as a Celery task? Then we have some options to consider:

  • The best example that I was able to find is the following: https://johnfraney.ca/posts/2018/12/20/writing-unit-tests-celery-tasks-async-functions/ (and I just found out that it is @Franey’s response):

    1. Define your async method.

    2. Use asgiref’s async_to_sync helper (from the asgiref.sync module) to wrap the async method and run it synchronously inside a Celery task:

      # tasks.py
      import asyncio
      from asgiref.sync import async_to_sync
      from celery import Celery
      
      app = Celery('async_test', broker='a_broker_url_goes_here')
      
      async def return_hello():
          await asyncio.sleep(1)
          return 'hello'
      
      
      @app.task(name="sync_task")
      def sync_task():
          async_to_sync(return_hello)()
      
  • A use case that I came upon in a FastAPI application was the reverse of the previous example:

    1. An intensive CPU-bound process is hogging the async endpoints.

    2. The solution is to refactor the CPU-bound process into a Celery task and hand it off to the Celery queue for execution (a routing sketch follows this list).

    3. A minimal example to visualize that case:

      import asyncio
      import uvicorn
      
      from celery import Celery
      from fastapi import FastAPI
      
      app = FastAPI(title='Example')
      worker = Celery('worker', broker='a_broker_url_goes_here')
      
      @worker.task(name='cpu_boun')
      def cpu_bound_task():
          # Does stuff but let's simplify it
          print([n for n in range(1000)])
      
      @app.get('/calculate')
      async def calculate():
          cpu_bound_task.delay()
      
      if __name__ == "__main__":
          uvicorn.run('main:app', host='0.0.0.0', port=8000)
      
  • Another solution seems to be what @juanra and @danius are proposing in their answers, but we have to keep in mind that performance tends to take a hit when we intermix sync and async execution, so those answers need monitoring before we can decide to use them in a production environment.
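
As a side note on @asksol’s "top tip" about routing CPU-bound tasks to a prefork worker: a hypothetical routing configuration for the FastAPI example above could look like the sketch below (the queue name and worker options are illustrative, not part of the original answers).

# Route the CPU-bound task to its own queue so that it is consumed by a
# dedicated prefork worker and never runs inside the event loop process.
worker.conf.task_routes = {'cpu_boun': {'queue': 'cpu_queue'}}

# Start that worker separately, for example:
#   celery -A main:worker worker -Q cpu_queue --pool=prefork --concurrency=4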

Finally, there are some ready-made solutions that I cannot recommend (because I have not used them myself), but I will list them here:

  • Celery Pool AsyncIO, which seems to solve exactly what Celery 5.0 did not deliver, but keep in mind that it seems a bit experimental (version 0.2.0 as of 01/12/2021)
  • aiotasks claims to be "a Celery like task manager that distributes Asyncio coroutines" but seems a bit stale (latest commit around 2 years ago)

Well, that didn’t age so well, did it? Celery 5.0 did not implement asyncio compatibility, so we cannot know when, or if, this will ever be implemented… I am leaving the rest of this section here for legacy reasons (it was the answer at the time) and for comment continuation.

That will be possible from Celery version 5.0 as stated on the official site:

http://docs.celeryproject.org/en/4.0/whatsnew-4.0.html#preface

  1. The next major version of Celery will support Python 3.5 only, where we are planning to take advantage of the new asyncio library.
  2. Dropping support for Python 2 will enable us to remove massive amounts of compatibility code, and going with Python 3.5 allows us to take advantage of typing, async/await, asyncio, and similar concepts there’s no alternative for in older versions.

The above was quoted from the previous link.

So the best thing to do is wait for version 5.0 to be released!

In the meantime, happy coding 🙂

Answered By: John Moutafis

You can wrap any blocking call into a Task using run_in_executor, as described in the documentation; I also added a custom timeout to the example:

import asyncio
import functools
from concurrent.futures import ThreadPoolExecutor

# Executor that runs the blocking calls outside the event loop.
executor = ThreadPoolExecutor()

def run_async_task(
    target,
    *args,
    timeout=60,
    **keywords
):
    # Run the blocking callable in the executor and bound the wait with a timeout.
    loop = asyncio.get_event_loop()
    return asyncio.wait_for(
        loop.run_in_executor(
            executor,
            functools.partial(target, *args, **keywords)
        ),
        timeout=timeout
    )

loop = asyncio.get_event_loop()
# Submit the Celery task without blocking the event loop...
async_result = loop.run_until_complete(
    run_async_task(your_task.delay, some_arg, some_karg="")
)
# ...then wait for the result the same way.
result = loop.run_until_complete(
    run_async_task(async_result.result)
)
Answered By: danius

The cleanest way I’ve found to do this is to wrap the async function in asgiref.sync.async_to_sync:

import asyncio

from asgiref.sync import async_to_sync
from celery.task import periodic_task


async def return_hello():
    await asyncio.sleep(1)
    return 'hello'


@periodic_task(
    run_every=2,
    name='return_hello',
)
def task_return_hello():
    async_to_sync(return_hello)()

I pulled this example from a blog post I wrote.

Answered By: Franey

This simple way worked fine for me:

import asyncio
from celery import Celery

app = Celery('tasks')

async def async_function(param1, param2):
    # more async stuff...
    pass

@app.task(name='tasks.task_name', queue='queue_name')
def task_name(param1, param2):
    asyncio.run(async_function(param1, param2))
Answered By: juanra

I solved the problem by combining Celery and asyncio in the celery-pool-asyncio library.

Answered By: kai3341

Here is a simple helper that you can use to make a Celery task awaitable:

import asyncio
from asgiref.sync import sync_to_async

# Converts a Celery task into an async function
def task_to_async(task):
    async def wrapper(*args, **kwargs):
        delay = 0.1
        async_result = await sync_to_async(task.delay)(*args, **kwargs)
        while not async_result.ready():
            await asyncio.sleep(delay)
            delay = min(delay * 1.5, 2)  # exponential backoff, max 2 seconds
        return async_result.get()
    return wrapper

Like sync_to_async, it can be used as a direct wrapper:

from time import sleep

from celery import shared_task


@shared_task
def get_answer():
    sleep(10)  # simulate a long computation
    return 42

result = await task_to_async(get_answer)()

…and as a decorator:

@task_to_async
@shared_task
def get_answer():
    sleep(10) # simulate long computation
    return 42    

result = await get_answer()

Of course, this is not a perfect solution since it relies on polling.
However, it should be a good workaround to call Celery tasks from Django async views until Celery officially provides a better solution.
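
For example, a hypothetical Django async view could await the wrapped task like this (the module paths and view name are assumptions for illustration, not part of the original answer):

# views.py -- illustrative Django async view; module paths are assumed
from django.http import JsonResponse

from .tasks import get_answer       # the @shared_task defined above
from .utils import task_to_async    # wherever the helper is defined


async def answer_view(request):
    # The event loop stays free while the Celery worker computes the answer.
    answer = await task_to_async(get_answer)()
    return JsonResponse({'answer': answer})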

EDIT 2021/03/02: added the call to sync_to_async to support eager mode.

Answered By: Benoit Blanchon

Here’s my implementation of Celery handling async coroutines when necessary:

Wrap the Celery class to extend its functionality:

from celery import Celery
from inspect import isawaitable
import asyncio


class AsyncCelery(Celery):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.patch_task()

        if 'app' in kwargs:
            self.init_app(kwargs['app'])

    def patch_task(self):
        TaskBase = self.Task

        class ContextTask(TaskBase):
            abstract = True

            async def _run(self, *args, **kwargs):
                # Run the task body; await it if it returned a coroutine.
                result = TaskBase.__call__(self, *args, **kwargs)
                if isawaitable(result):
                    return await result
                return result

            def __call__(self, *args, **kwargs):
                return asyncio.run(self._run(*args, **kwargs))

        self.Task = ContextTask

    def init_app(self, app):
        self.app = app

        conf = {}
        for key in app.config.keys():
            if key[0:7] == 'CELERY_':
                conf[key[7:].lower()] = app.config[key]

        if 'broker_transport_options' not in conf and conf.get('broker_url', '')[0:4] == 'sqs:':
            conf['broker_transport_options'] = {'region': 'eu-west-1'}

        self.config_from_object(conf)


celery = AsyncCelery()
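
With the patched Task class, a coroutine function can be registered like any other task. A minimal usage sketch (the task name and body are illustrative, not from the original answer):

@celery.task(name='async_example')
async def async_example():
    # The patched __call__ drives this coroutine with asyncio.run() in the worker.
    await asyncio.sleep(1)
    return 'done'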
Answered By: Cyril N.

A nice way to implement Celery with asyncio:

import asyncio
from celery import Celery

app = Celery()

async def async_function(param):
    print('do something')

@app.task()
def celery_task(param):
    loop = asyncio.get_event_loop()
    return loop.run_until_complete(async_function(param))
Answered By: Gustavo Bakker