How to stop celery worker process

Question:

I have a Django project on an Ubuntu EC2 node, which I have been using to set up asynchronous tasks using Celery.

I am following this along with the docs.

I’ve been able to get a basic task working at the command line, using:

(env1)ubuntu@ip-172-31-22-65:~/projects/tp$ celery --app=myproject.celery:app worker --loglevel=INFO

to start a worker. I have since made some changes to the Python code and realized that I need to restart the worker.
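For reference, the --app=myproject.celery:app path in that command points at a Celery entry-point module roughly like the sketch below. This is a minimal illustration only, assuming the usual Django/Celery layout; the file contents and the settings module name are assumptions, not the asker's actual code.

    # myproject/celery.py -- minimal sketch matching --app=myproject.celery:app
    # (module layout and settings path are assumptions)
    import os
    from celery import Celery
    from django.conf import settings

    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

    app = Celery('myproject')
    app.config_from_object('django.conf:settings')
    app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)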

From the command line, I’ve tried:

 ps auxww | grep 'celery worker' | awk '{print $2}' | xargs kill -9

But I can see that the worker is still running.

How can I kill it?

edit:

(env1)ubuntu@ip-172-31-22-65:~/projects/tp$ sudo ps auxww | grep celeryd | grep -v "grep" | awk '{print $2}' | sudo xargs kill -HUP
kill: invalid argument H

Usage:
 kill [options] <pid> [...]

Options:
 <pid> [...]            send signal to every <pid> listed
 -<signal>, -s, --signal <signal>
                        specify the <signal> to be sent
 -l, --list=[<signal>]  list all signal names, or convert one to a name
 -L, --table            list all signal names in a nice table

 -h, --help     display this help and exit
 -V, --version  output version information and exit

For more details see kill(1).

edit 2:

(env1)ubuntu@ip-172-31-22-65:~/projects/tp$ ps aux|grep celery
ubuntu    9756  0.0  3.4 100868 35508 pts/6    S+   15:49   0:07 /home/ubuntu/.virtualenvs/env1/bin/python3.4 /home/ubuntu/.virtualenvs/env1/bin/celery --app=tp.celery:app worker --loglevel=INFO
ubuntu    9760  0.0  3.9 255840 39852 pts/6    S+   15:49   0:05 /home/ubuntu/.virtualenvs/env1/bin/python3.4 /home/ubuntu/.virtualenvs/env1/bin/celery --app=tp.celery:app worker --loglevel=INFO
ubuntu   12760  0.0  0.0  10464   932 pts/7    S+   19:04   0:00 grep --color=auto celery
Asked By: user1592380


Answers:

Try this in the terminal:

ps aux|grep 'celery worker'

You will see output like this:

username  29042  0.0  0.6  23216 14356 pts/1    S+   00:18   0:01 /bin/celery worker ...

Then kill the process by its ID:

sudo kill -9 process_id # here 29042

If you have multiple processes, then you have to kill all of the process IDs using the above kill command:

sudo kill -9 id1 id2 id3 ...

From the Celery docs:

ps auxww | grep 'celery worker' | awk '{print $2}' | xargs kill -9

Or, if you are running celeryd:

ps auxww | grep celeryd | awk '{print $2}' | xargs kill -9

Note

If you are running Celery under supervisor, the process will automatically be restarted even after you kill it (if autorestart=true is set in the supervisor config).
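For illustration, such a supervisor program section might look like the sketch below; the program name, paths, and options here are assumptions, not taken from the question.

    [program:celeryworker]
    ; paths and names are illustrative
    command=/home/ubuntu/.virtualenvs/env1/bin/celery --app=myproject.celery:app worker --loglevel=INFO
    directory=/home/ubuntu/projects/tp
    autostart=true
    ; autorestart=true is what brings a killed worker straight back
    autorestart=true
    ; make sure stop signals reach the whole worker process group
    stopasgroup=true

In that case, stop or restart the worker through supervisor instead of killing the PID directly:

    sudo supervisorctl stop celeryworker
    sudo supervisorctl restart celeryworker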

Answered By: itzMEonTV

pkill -f "celery worker"

This is an easy way to kill processes matching a string pattern.
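pkill sends SIGTERM by default; if the worker will not stop gracefully, the same pattern match can deliver SIGKILL instead (a variant not in the original answer):

pkill -9 -f "celery worker"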

Answered By: alan_wang

If the celery worker is running on a machine you do not have access to, you can use Celery "remote control" to control workers through messages sent via the broker.

celery control shutdown

This will shut down all workers. Depending on your setup, you might have to use -A myProject, as with Django.
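For example, with a Django project the full command might look like this (the project name is a placeholder):

celery -A myproject control shutdown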

Documentation here.

Answered By: Paulo Peres Junior

ps auxww | grep 'celery worker' | grep -v " grep " | awk '{print $2}' | xargs kill -9

This one is very similar to the command presented above, but improved: the grep -v " grep " filter avoids the error that appears when the pipeline attempts to kill the grep process itself.

Answered By: walter

In case someone is looking to shut down their Celery app programmatically, the same thing can be done in Python with:
celery_app.control.shutdown()
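For example, using the application module from the question (a minimal sketch; the import path is taken from the asker's --app argument, and the call broadcasts a shutdown request to every worker connected to the broker):

    # ask all workers listening on the broker to shut down
    from myproject.celery import app

    app.control.shutdown()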

Answered By: D4nt3