How to keep multiple independent celery queues?

Question:

I’m trying to keep multiple celery queues with different tasks and workers in the same redis database. Really just a convenience issue of only wanting one redis server rather than two on my machine.

I followed the celery tutorial docs verbatim, as it was the only way I could get it to work. Now when I try to duplicate everything with slightly tweaked names/queues, it keeps erroring out.

Note – I’m newish to Python and Celery, which is obviously part of the problem. I’m not sure which parts are named “task/tasks” as a name vs. special words.

My condensed version of docs:
Run celery -A tasks worker to spawn the workers.
tasks.py contains task code, with celery = Celery('tasks', broker='redis://localhost') to connect to Celery, and @task() above the functions I want to delay.

Within my program for queueing tasks…

from tasks import do_work
do_work.delay()

So given all of the above, what are the steps I need to take to turn this into two types of tasks that run independently on separate queues and workers? For example, blue_tasks and red_tasks?

I’ve tried changing all instances of tasks to blue_tasks or red_tasks. However, when I queue blue_tasks, the red_tasks workers I’ve started up start trying to work on them.

I read about default queues and such, so I tried this code, which didn’t work:

CELERY_DEFAULT_QUEUE = 'red'
CELERY_QUEUES = (
    Queue('red', Exchange('red'), routing_key='red'),
)

As a side note, I don’t understand why celery worker errors out with celery attempting to connect to a default amqp instance, while celery -A tasks worker tells celery to connect to Redis. What task code is celery worker attempting to run on the worker if nothing has been specified?

Asked By: jwoww


Answers:

By default everything goes into a default queue named celery (and this is what celery worker will process if no queue is specified).

So say you have your do_work task function in django_project_root/myapp/tasks.py.

You could configure the do_work task to live in its own queue like so:

CELERY_ROUTES = {
    'myproject.tasks.do_work': {'queue': 'red'},
}

Then run a worker using celery worker -Q red and it will only process things in that queue (another worker invoked with a plain celery worker will only pick up things in the default queue).

The task routing section in the documentation should explain all.

Answered By: dbr

To link to different queues dynamically, follow the steps below:

1) Specify the name of the queue with the ‘queue’ attribute

celery.send_task('job1', args=[], kwargs={}, queue='queue_name_1')
celery.send_task('job1', args=[], kwargs={}, queue='queue_name_2')

(Here a particular job uses two queues)

2) Add the following entry in the configuration file

CELERY_CREATE_MISSING_QUEUES = True

3) While starting the worker, use -Q to specify the queue name from which jobs are to be consumed

celery -A proj worker -l info -Q queue1 
celery -A proj worker -l info -Q queue2
Answered By: josepainumkal