What is the proper way to make downstream HTTPS requests inside of Uvicorn/FastAPI?

Question:

I have an API endpoint (FastAPI / Uvicorn). Among other things, it makes a request to yet another API for information. When I load my API with multiple concurrent requests, I begin to receive the following error:

h11._util.LocalProtocolError: can't handle event type ConnectionClosed when role=SERVER and state=SEND_RESPONSE

In a normal environment, I would take advantage of requests.Session, but I understand it is not fully thread safe.

Thus, what is the proper approach to using requests within a framework such as FastAPI, where multiple threads would be using the requests library at the same time?

Asked By: SteveJ


Answers:

Instead of using requests, you could use httpx, which offers an async API as well. (httpx is also suggested in FastAPI’s documentation for performing async tests, and FastAPI/Starlette recently replaced the HTTP client used by TestClient from requests with httpx.)

The example below is based on the one given in the httpx documentation, demonstrating how to use the library to make an asynchronous HTTP(S) request and then stream the response back to the client. httpx.AsyncClient() is what you can use instead of requests.Session(); it is useful when several requests are made to the same host, as the underlying TCP connection is reused instead of being recreated for every single request, resulting in a significant performance improvement. It also allows you to reuse headers and other settings (such as proxies and timeouts), as well as persist cookies, across requests.

You spawn a single Client and reuse it every time you need it. You can use await client.aclose() to explicitly close the client once you are done with it (you could do that inside a shutdown event handler, for instance). Examples and more details can also be found in this answer.

from fastapi import FastAPI
from fastapi.responses import StreamingResponse
from starlette.background import BackgroundTask
import httpx

# A single AsyncClient shared across requests, so the underlying
# TCP connection pool is reused.
client = httpx.AsyncClient()
app = FastAPI()

@app.on_event('shutdown')
async def shutdown_event():
    # Close the client (and its connection pool) when the app shuts down.
    await client.aclose()

@app.get('/')
async def home():
    req = client.build_request('GET', 'https://www.example.com/')
    r = await client.send(req, stream=True)
    # Stream the upstream response to the caller; close it when streaming ends.
    return StreamingResponse(r.aiter_text(), background=BackgroundTask(r.aclose))

Using the async API of httpx means that you have to define your endpoints with async def; otherwise, you would have to use the standard synchronous API (for def vs async def, see this answer). As described in this GitHub discussion:

Yes. HTTPX is intended to be thread-safe, and yes, a single client-instance across all threads will do better in terms of connection pooling, than using an instance-per-thread.

You can also control the connection pool size using the limits keyword argument on the Client (see Pool limit configuration). For example:

limits = httpx.Limits(max_keepalive_connections=5, max_connections=10)
client = httpx.Client(limits=limits)

Answered By: Chris