Equivalent of asyncio.Queues with worker "threads"

Question:

I’m trying to figure out how to port a threaded program to use asyncio. I have a lot of code which synchronizes around a few standard library Queues, basically like this:

import queue, random, threading, time

q = queue.Queue()

def produce():
    while True:
        time.sleep(0.5 + random.random())  # sleep for .5 - 1.5 seconds
        q.put(random.random())

def consume():
    while True: 
        value = q.get(block=True)
        print("Consumed", value)

threading.Thread(target=produce).start()
threading.Thread(target=consume).start()

One thread creates values (possibly user input), and another thread does something with them. The point is that these threads are idle until there’s new data, at which point they wake up and do something with it.

I’m trying to implement this pattern using asyncio, but I can’t seem to figure out how to make it “go”.

My attempts look more or less like this (and don’t do anything at all).

import asyncio, random

q = asyncio.Queue()

@asyncio.coroutine
def produce():
    while True: 
        q.put(random.random())
        yield from asyncio.sleep(0.5 + random.random())

@asyncio.coroutine
def consume():
    while True:
        value = yield from q.get()
        print("Consumed", value)

# do something here to start the coroutines. asyncio.Task()? 

loop = asyncio.get_event_loop()
loop.run_forever()

I’ve tried variations on using coroutines, not using them, wrapping stuff in Tasks, trying to make them create or return futures, etc.

I’m starting to think that I have the wrong idea about how I should be using asyncio (maybe this pattern should be implemented in a different way that I’m not aware of).
Any pointers would be appreciated.

Asked By: Seth


Answers:

Yes, exactly. Tasks are your friends:

import asyncio, random

q = asyncio.Queue()

@asyncio.coroutine
def produce():
    while True:
        yield from q.put(random.random())
        yield from asyncio.sleep(0.5 + random.random())

@asyncio.coroutine
def consume():
    while True:
        value = yield from q.get()
        print("Consumed", value)


loop = asyncio.get_event_loop()
loop.create_task(produce())
loop.create_task(consume())
loop.run_forever()

asyncio.ensure_future() can also be used for task creation.

And please keep in mind: q.put() is a coroutine, so you should use yield from q.put(value).

UPD

Switched the example from asyncio.Task()/asyncio.async() to the brand new API loop.create_task() and asyncio.ensure_future().

Answered By: Andrew Svetlov

Here’s what I use in production, moved to gist: https://gist.github.com/thehesiod/7081ab165b9a0d4de2e07d321cc2391d

Answered By: amohr

A bit late and maybe off-topic, but keep in mind that you can consume from the queue from multiple tasks, as if they were independent consumers.

The following snippet shows, as an example, how to achieve the same thread-pool pattern with asyncio tasks.

import asyncio

q = asyncio.Queue()

async def sum(x):
    await asyncio.sleep(0.1)  # simulate asynchronous work
    return x

async def consumer(i):
    print("Consumer {} started".format(i))
    while True:
        f, x = await q.get()
        print("Consumer {} processing {}".format(i, x))
        r = await sum(x)
        f.set_result(r)

async def producer():
    consumers = [asyncio.ensure_future(consumer(i)) for i in range(5)]
    loop = asyncio.get_event_loop()
    tasks = [(asyncio.Future(), x) for x in range(10)]
    for task in tasks:
        await q.put(task)

    # wait until all futures are completed
    results = await asyncio.gather(*[f for f, _ in tasks])
    assert results == [x for _, x in tasks]

    # destroy tasks
    for c in consumers:
        c.cancel()


asyncio.get_event_loop().run_until_complete(producer())
Answered By: pfreixes
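If you don't need a per-item result, asyncio.Queue's built-in task_done()/join() pair is a lighter alternative to pairing each item with a Future: the producer awaits q.join(), which unblocks once every item has been matched by a q.task_done() call. A minimal sketch (the doubling of each item is just illustrative):

```python
import asyncio

async def worker(q, results):
    while True:
        x = await q.get()
        await asyncio.sleep(0.01)   # simulate asynchronous work
        results.append(x * 2)
        q.task_done()               # mark this item as fully processed

async def main():
    q = asyncio.Queue()
    results = []
    workers = [asyncio.create_task(worker(q, results)) for _ in range(3)]
    for x in range(10):
        await q.put(x)
    await q.join()                  # blocks until task_done() was called once per item
    for w in workers:
        w.cancel()                  # the workers loop forever, so cancel them
    return results

results = asyncio.run(main())
print(sorted(results))
```

The trade-off: join() only tells you everything was processed, while the Future-per-item approach above also hands each result back to the producer.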