python-rq worker not reading jobs in queue

Question:

I’m facing a basic issue while setting up python-rq: the rqworker doesn’t seem to recognize jobs that are pushed to the queue it’s listening on.

Everything is run inside a virtualenv.
I have the following code:

from redis import Redis
from rq import Queue
from rq.registry import FinishedJobRegistry
from videogen import videogen
import time

redis_conn = Redis(port=5001)
videoq = Queue('medium', connection=redis_conn)
fin_registry = FinishedJobRegistry(connection=redis_conn, name='medium')

jobid = 1024
job = videoq.enqueue(videogen, jobid)

while not job.is_finished:
    time.sleep(2)
    print(job.result)  # stays None until the worker has processed the job

Here videogen is a simple function which immediately returns the integer parameter it receives.
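For reference, a minimal stand-in that matches this description (the real videogen module isn’t shown in the question):

```python
# videogen.py -- hypothetical stand-in matching the question's description:
# a trivial job function that immediately returns the integer it receives.
def videogen(jobid):
    return jobid
```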

On running rqworker medium and starting the app, no result is printed. There is no output from rqworker other than this:

14:41:29 RQ worker started, version 0.5.0
14:41:29 
14:41:29 *** Listening on medium...

The Redis instance is accessible from the same shell where I run rqworker, and it even shows the updated keys:

127.0.0.1:5001> keys *
1) "rq:queues"
2) "rq:queue:medium"
3) "rq:job:9a46f9c5-03e1-4b08-946b-61ad2c3815b1"

So what is possibly missing here?

Asked By: eternalthinker


Answers:

Silly error: I had to supply the Redis connection URL to rqworker:
rqworker --url redis://localhost:5001 medium
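Without the --url flag, rqworker connects to Redis on the default port (6379), so it was listening to a different Redis instance than the one the app enqueued to on port 5001. The value passed to --url is a standard Redis URL; a quick stdlib check (no RQ required) shows which host and port it encodes:

```python
from urllib.parse import urlparse

# Parse the same URL given to `rqworker --url` to see which host/port it targets
url = urlparse('redis://localhost:5001')
print(url.hostname, url.port)  # -> localhost 5001
```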

Answered By: eternalthinker

It’s worth noting that this can also happen if you run your RQ workers on Windows, which is not supported by the workers. From the documentation:

RQ workers will only run on systems that implement fork(). Most
notably, this means it is not possible to run the workers on Windows
without using the Windows Subsystem for Linux and running in a bash
shell.
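A quick way to check, before starting a worker, whether the current platform provides fork() (a sketch for diagnosis; RQ itself performs no such pre-check):

```python
import os

# RQ workers call os.fork() to execute each job in a child process;
# native Windows lacks fork(), which is why workers won't run there.
if hasattr(os, 'fork'):
    print("fork() available - RQ workers can run on this platform")
else:
    print("no fork() (e.g. native Windows) - use WSL or run the worker elsewhere")
```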

Answered By: Steven