How to integrate RabbitMQ RPC into FastAPI properly
Question:
I am improving my FastAPI project. One of the endpoints needs to run a heavy computational task on another machine. Because of the high load, this work should go through a queue, so I am following the RabbitMQ RPC tutorial to perform remote procedure calls via a message queue.
That tutorial suggests creating an exclusive callback queue for each client-server session, which means that if I create a new RPC client instance for every method call, N calls will create N queues, which is obviously inefficient.
So, my questions are:
- Is there a way in FastAPI to create a fixed pool of workers and give each worker its own unique RPCClient instance? Or, alternatively, to create a fixed pool of those clients and hand each new worker a client from that pool?
- How much slower would the straightforward solution described above be?
- Is there an efficient way to return results to several workers through a single response queue?
Answers:
I managed to solve all of these problems by simply using Celery with RabbitMQ as the broker.