How to continuously wait on any of multiple concurrent tasks to complete?


Let’s say there are multiple sources of events I want to monitor and respond to in an orderly fashion – for instance multiple connected sockets.

What’s the best way to continuously await until any of them has data available to be read?

asyncio.wait seems promising, but I am unsure how to make sure that tasks for sockets that were just read from get re-added to the list of tasks to await on.

I tried to re-schedule all of the reads every time the loop ran, but that (obviously) didn’t work.

As a hack, I came up with cancelling the pending tasks on each iteration of the loop. The code I currently have looks like this, but I'm not sure it's actually correct in all cases.

while True:
    # Re-create a read task for every socket on each iteration.
    # `sockets`, `sock.read()` and `handle_data()` are placeholders for
    # whatever socket objects and processing the application uses.
    tasks = [asyncio.create_task(sock.read()) for sock in sockets]
    done, pending = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)

    for received in done:
        handle_data(received.result())

    for to_cancel in pending:
        to_cancel.cancel()
What would be the most elegant (and correct!) way of doing this?

Asked By: JanLikar



Just re-create a task calling each socket's .read() method every time one of the sockets returns data. By wrapping the coroutine in a task, you can attach arbitrary metadata to it (in the form of plain Python attributes), which makes it easy to track which task should be re-created:

async def worker():

    pending_sockets = [socket1, socket2]
    pending_tasks = set()
    while True:
        # (Re-)create a read task for each socket that is not being awaited yet:
        for sock in pending_sockets:
            task = asyncio.create_task(sock.read())
            task.source = sock  # plain attribute: remember which socket this read belongs to
            pending_tasks.add(task)

        done, pending_tasks = await asyncio.wait(
            pending_tasks, return_when=asyncio.FIRST_COMPLETED
        )

        pending_sockets = []
        for received in done:
            handle_data(received.result())  # placeholder for the actual processing
            # This socket's read finished, so schedule a new read for it
            # on the next iteration; the still-pending tasks are left alone.
            pending_sockets.append(received.source)
Answered By: jsbueno
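
Here is a self-contained, runnable sketch of that pattern, using asyncio.Queue objects as stand-ins for the sockets (queue.get() plays the role of the assumed sock.read()); the names worker, sources and results are illustrative, not part of any API:

```python
import asyncio

async def worker(sources, results, n_messages):
    """Read from all sources concurrently until n_messages items are collected."""
    pending_sources = list(sources)
    pending_tasks = set()
    while len(results) < n_messages:
        # Re-create a read task only for sources whose previous read finished:
        for src in pending_sources:
            task = asyncio.create_task(src.get())
            task.source = src  # plain attribute: which source this read belongs to
            pending_tasks.add(task)

        done, pending_tasks = await asyncio.wait(
            pending_tasks, return_when=asyncio.FIRST_COMPLETED
        )

        pending_sources = []
        for finished in done:
            results.append(finished.result())
            # This source's read completed: schedule a new read next iteration.
            pending_sources.append(finished.source)

    # Clean up any reads still in flight before returning.
    for task in pending_tasks:
        task.cancel()

async def main():
    q1, q2 = asyncio.Queue(), asyncio.Queue()
    for i in range(3):
        q1.put_nowait(f"q1-{i}")
        q2.put_nowait(f"q2-{i}")
    results = []
    await worker([q1, q2], results, 6)
    return results

print(sorted(asyncio.run(main())))
# → ['q1-0', 'q1-1', 'q1-2', 'q2-0', 'q2-1', 'q2-2']
```

Note that reassigning pending_tasks from asyncio.wait's return value is what keeps still-pending reads alive across iterations, instead of cancelling and re-issuing them as in the question's hack.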