Simplest async/await example possible in Python

Question:

I’ve read many examples, blog posts, and questions/answers about asyncio / async / await in Python 3.5+. Many were complex; the simplest I found was probably this one.
Still, it uses ensure_future, and for learning purposes about asynchronous programming in Python, I would like to see an even more minimal example, and to know the minimal tools necessary to do a basic async / await example.

Question: is it possible to give a simple example showing how async / await works, by using only these two keywords + code to run the async loop + other Python code but no other asyncio functions?

Example: something like this:

import asyncio

async def async_foo():
    print("async_foo started")
    await asyncio.sleep(5)
    print("async_foo done")

async def main():
    asyncio.ensure_future(async_foo())  # fire and forget async_foo()
    print('Do some actions 1')
    await asyncio.sleep(5)
    print('Do some actions 2')

loop = asyncio.get_event_loop()
loop.run_until_complete(main())

but without ensure_future, and still demonstrates how await / async works.

Asked By: Basj


Answers:

is it possible to give a simple example showing how async / await works, by using only these two keywords + asyncio.get_event_loop() + run_until_complete + other Python code but no other asyncio functions?

This way it’s possible to write code that works:

import asyncio


async def main():
    print('done!')


if __name__ ==  '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())

But this way it’s impossible to demonstrate why you need asyncio.

By the way, why do you need asyncio, and not just plain code? The answer is that asyncio gives you performance benefits when you overlap I/O-bound operations (like reading from / writing to the network). And to write a useful example, you need to use an async implementation of those operations.

Please read this answer for a more detailed explanation.
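
To get a feel for what a real async I/O operation looks like, rather than asyncio.sleep standing in for one, here is a rough sketch using asyncio's stream API. It assumes network access and that the two hosts answer plain HTTP on port 80; the hostnames are only examples.

import asyncio

async def fetch_head(host):
    # await suspends this coroutine until the connection is ready,
    # letting the event loop run other tasks in the meantime
    reader, writer = await asyncio.open_connection(host, 80)
    writer.write(f"HEAD / HTTP/1.0\r\nHost: {host}\r\n\r\n".encode())
    await writer.drain()
    status_line = await reader.readline()  # suspends until data arrives
    writer.close()
    await writer.wait_closed()
    return host, status_line.decode().strip()

async def main():
    # both requests are in flight at the same time
    print(await asyncio.gather(fetch_head('example.com'), fetch_head('python.org')))

asyncio.run(main())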

Update:

OK, here’s an example that uses asyncio.sleep to imitate an I/O-bound operation and asyncio.gather to show how you can run multiple such operations concurrently:

import asyncio


async def io_related(name):
    print(f'{name} started')
    await asyncio.sleep(1)
    print(f'{name} finished')


async def main():
    await asyncio.gather(
        io_related('first'),
        io_related('second'),
    )  # two 1-second sleeps overlap: total is just over 1s, not 2s


if __name__ ==  '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(main())

Output:

first started
second started
first finished
second finished
[Finished in 1.2s]

Note how both io_related coroutines start, and then, after only about one second, both finish.
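
For contrast, here is a rough sketch of the sequential version: awaiting the same coroutines one after the other removes the overlap, so the total time is roughly the sum of the sleeps.

import asyncio

async def io_related(name):
    print(f'{name} started')
    await asyncio.sleep(1)
    print(f'{name} finished')

async def main():
    # each await runs to completion before the next starts: ~1s + ~1s = ~2s total
    await io_related('first')
    await io_related('second')

asyncio.run(main())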

Answered By: Mikhail Gerasimov

To answer your questions, I will provide three different solutions to the same problem.

Case 1: just normal Python

import time

def sleep():
    print(f'Time: {time.time() - start:.2f}')
    time.sleep(1)

def sum(name, numbers):
    total = 0
    for number in numbers:
        print(f'Task {name}: Computing {total}+{number}')
        sleep()
        total += number
    print(f'Task {name}: Sum = {total}\n')

start = time.time()
tasks = [
    sum("A", [1, 2]),
    sum("B", [1, 2, 3]),
]
end = time.time()
print(f'Time: {end-start:.2f} sec')

Output:

Task A: Computing 0+1
Time: 0.00
Task A: Computing 1+2
Time: 1.00
Task A: Sum = 3

Task B: Computing 0+1
Time: 2.01
Task B: Computing 1+2
Time: 3.01
Task B: Computing 3+3
Time: 4.01
Task B: Sum = 6

Time: 5.02 sec

Case 2: async/await done wrong

import asyncio
import time

async def sleep():
    print(f'Time: {time.time() - start:.2f}')
    time.sleep(1)

async def sum(name, numbers):
    total = 0
    for number in numbers:
        print(f'Task {name}: Computing {total}+{number}')
        await sleep()
        total += number
    print(f'Task {name}: Sum = {total}\n')

start = time.time()

loop = asyncio.get_event_loop()
tasks = [
    loop.create_task(sum("A", [1, 2])),
    loop.create_task(sum("B", [1, 2, 3])),
]
loop.run_until_complete(asyncio.wait(tasks))
loop.close()

end = time.time()
print(f'Time: {end-start:.2f} sec')

Output:

Task A: Computing 0+1
Time: 0.00
Task A: Computing 1+2
Time: 1.00
Task A: Sum = 3

Task B: Computing 0+1
Time: 2.01
Task B: Computing 1+2
Time: 3.01
Task B: Computing 3+3
Time: 4.01
Task B: Sum = 6

Time: 5.01 sec

Case 3: async/await done right

The same as case 2, except the sleep function:

async def sleep():
    print(f'Time: {time.time() - start:.2f}')
    await asyncio.sleep(1)

Output:

Task A: Computing 0+1
Time: 0.00
Task B: Computing 0+1
Time: 0.00
Task A: Computing 1+2
Time: 1.00
Task B: Computing 1+2
Time: 1.00
Task A: Sum = 3

Task B: Computing 3+3
Time: 2.00
Task B: Sum = 6

Time: 3.01 sec

Case 1 and case 2 both take about 5 seconds, whereas case 3 takes only about 3 seconds. So async/await done right is faster.

The reason for the difference is within the implementation of the sleep function.

# Case 1
def sleep():
    ...
    time.sleep(1)

# Case 2
async def sleep():
    ...
    time.sleep(1)

# Case 3
async def sleep():
    ...
    await asyncio.sleep(1)

In case 1 and case 2, the two sleeps are effectively the same:
they "sleep" without letting anything else use the resources in the meantime.
In case 3, by contrast, the coroutine gives up the resources while it is asleep, so other tasks can use them.

In case 2, we added async to the normal function. However, the event loop will run it without interruption.
Why? Because we never told the loop where it is allowed to suspend your function to run another task.
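
As an aside, asyncio's debug mode can help catch this kind of mistake at runtime: it logs any callback or task step that keeps the loop busy for longer than loop.slow_callback_duration (0.1 s by default). A small illustrative sketch, reusing the blocking sleep idea from case 2:

import asyncio
import time

async def blocking_sleep():
    time.sleep(1)  # blocks the whole event loop for one second

async def main():
    await blocking_sleep()

# debug=True should produce a warning along the lines of
# "Executing <Task ...> took 1.0 seconds"
asyncio.run(main(), debug=True)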

In case 3, we told the event loop exactly where to interrupt the function to run another task. Where exactly? Right here!

await asyncio.sleep(1)

For more on this, read here.
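
By the way, even a zero-length sleep is a valid suspension point. A minimal illustrative sketch showing that await asyncio.sleep(0) alone is enough to let the loop alternate between two coroutines:

import asyncio

async def counter(name):
    for i in range(3):
        print(f'{name}: {i}')
        # zero delay, but still a suspension point where the loop
        # may switch to the other task
        await asyncio.sleep(0)

async def main():
    await asyncio.gather(counter('A'), counter('B'))

asyncio.run(main())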


Answered By: Levon

Python 3.7+ has a simpler API (in my opinion) with simpler wording (easier to remember than "ensure_future"): you can use asyncio.create_task, which returns a Task object (useful later if you need to cancel the task).

Basic example 1

import asyncio

async def hello(i):
    print(f"hello {i} started")
    await asyncio.sleep(4)
    print(f"hello {i} done")

async def main():
    task1 = asyncio.create_task(hello(1))  # returns immediately, the task is created
    await asyncio.sleep(3)
    task2 = asyncio.create_task(hello(2))
    await task1
    await task2

asyncio.run(main())  # main loop

Result:

hello 1 started
hello 2 started
hello 1 done
hello 2 done
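
As mentioned above, keeping the Task object around lets you cancel it later. Here is a small illustrative sketch of that, reusing the hello coroutine from basic example 1:

import asyncio

async def hello(i):
    print(f"hello {i} started")
    await asyncio.sleep(4)
    print(f"hello {i} done")

async def main():
    task = asyncio.create_task(hello(1))
    await asyncio.sleep(1)  # let it run for a while
    task.cancel()           # request cancellation
    try:
        await task          # the cancelled task raises CancelledError here
    except asyncio.CancelledError:
        print("hello 1 was cancelled")

asyncio.run(main())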


Basic example 2

If you need to get the return values of these async functions, gather is useful. The following example is inspired by the documentation.

import asyncio

async def factorial(n):
    f = 1
    for i in range(2, n + 1):
        print(f"Computing factorial({n}), currently i={i}...")
        await asyncio.sleep(1)
        f *= i
    return f

async def main():
    L = await asyncio.gather(factorial(2), factorial(3), factorial(4))
    print(L)  # [2, 6, 24]

asyncio.run(main())

Expected output:

Computing factorial(2), currently i=2...
Computing factorial(3), currently i=2...
Computing factorial(4), currently i=2...
Computing factorial(3), currently i=3...
Computing factorial(4), currently i=3...
Computing factorial(4), currently i=4...
[2, 6, 24]
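
If some of the gathered coroutines can raise, gather(..., return_exceptions=True) puts the exception objects in the result list instead of raising the first one out of the await. A short illustrative sketch (the failing input is made up):

import asyncio

async def might_fail(n):
    await asyncio.sleep(0.1)
    if n == 0:
        raise ValueError("n must be non-zero")
    return 10 / n

async def main():
    # exceptions are returned in the result list instead of being raised
    results = await asyncio.gather(
        might_fail(2), might_fail(0), might_fail(5),
        return_exceptions=True,
    )
    print(results)  # e.g. [5.0, ValueError('n must be non-zero'), 2.0]

asyncio.run(main())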


PS: even if you use asyncio and not trio, the latter's tutorial was helpful for me to grok Python asynchronous programming.

Answered By: Basj

Since everything has already been nicely explained, let's run some examples that compare synchronous code to asynchronous code.

synchronous code:

import time

def count():
    time.sleep(1)
    print('1')
    time.sleep(1)
    print('2')
    time.sleep(1)
    print('3')

def main():
    for i in range(3):
        count()

if __name__ == "__main__":
    t = time.perf_counter()
    main()
    t2 = time.perf_counter()
    
    print(f'Total time elapsed: {t2 - t:0.2f} seconds')

output:

1
2
3
1
2
3
1
2
3
Total time elapsed: 9.00 seconds

We can see that each call to count runs to completion before the next one begins.

asynchronous code:

import asyncio
import time

async def count():
    await asyncio.sleep(1)
    print('1')
    await asyncio.sleep(1)
    print('2')
    await asyncio.sleep(1)
    print('3')

async def main():
    await asyncio.gather(count(), count(), count())

if __name__ == "__main__":
    t = time.perf_counter()
    asyncio.run(main())
    t2 = time.perf_counter()

    print(f'Total time elapsed: {t2 - t:0.2f} seconds')

output:

1
1
1
2
2
2
3
3
3
Total time elapsed: 3.00 seconds

The asynchronous equivalent, on the other hand, took three seconds to run as opposed to nine.
The first count coroutine started, and as soon as it hit await asyncio.sleep(1), Python was free to do other work, for instance starting the second and then the third count coroutine.
That is why the output shows all the 1s, then all the 2s, then all the 3s.
Programming concurrently can be a very valuable tool.
With multiprocessing, the operating system does all of the multitasking work, and in Python it is the only option for multi-core concurrency, that is, having your program execute on multiple CPU cores.
If you use threads, the operating system still does all of the multitasking work, but in CPython the global interpreter lock prevents multi-core concurrency.
With asynchronous programming there is no operating system intervention: there is one process and one thread. So what is going on? Tasks can release the CPU when they hit waiting periods, so that other tasks can use it.

import asyncio

loop = asyncio.get_event_loop()


async def greeter(name):
    print(f"Hi, {name} you're in a coroutine.")

try:
    print('starting coroutine')
    coro = greeter('LP')
    print('entering event loop')
    loop.run_until_complete(coro)
finally:
    print('closing event loop')
    loop.close()

output:

starting coroutine
entering event loop
Hi, LP you're in a coroutine.
closing event loop

Asynchronous frameworks need a scheduler, usually called an event loop. The event loop keeps track of all the running tasks; when a function suspends, it returns control to the event loop, which then finds another function to start or resume. This is called cooperative multitasking. asyncio provides an asynchronous framework centered on such an event loop, and it efficiently handles input/output events: an application interacts with the event loop explicitly, registers code to be run, and lets the event loop (the scheduler) make the necessary calls into the application code when the resources are available.
So, if a network server opens sockets and registers them to be notified when input events occur on them, the event loop will alert the server code when there is a new incoming connection or when there is data to be read. If there is no more data to read from a socket, the server yields control back to the event loop.
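
As a concrete sketch of that pattern (illustrative only; host and port are arbitrary), asyncio's high-level stream API hides the socket registration: start_server hands each new connection to a coroutine, and every await inside it is a point where control goes back to the event loop.

import asyncio

async def handle_client(reader, writer):
    # runs once per connection; each await yields to the event loop
    # while waiting for network I/O
    while True:
        data = await reader.read(1024)
        if not data:            # b'' means the peer closed the connection
            break
        writer.write(data)      # echo it back
        await writer.drain()
    writer.close()
    await writer.wait_closed()

async def main():
    server = await asyncio.start_server(handle_client, '127.0.0.1', 8888)
    async with server:
        await server.serve_forever()

asyncio.run(main())  # runs until interrupted (Ctrl+C)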

The mechanism for yielding control back to the event loop depends on coroutines. Coroutines are a language construct designed for concurrent operation: a coroutine can pause its execution with the await keyword on another coroutine, and while it is paused the coroutine's state is maintained, allowing it to resume where it left off. One coroutine can start another and then wait for its results, which makes it easier to decompose a task into reusable parts.

import asyncio

async def outer():
    print('in outer')
    print('waiting for result 1')
    result1 = await phase1()
    print('waiting for result 2')
    result2 = await phase2(result1)
    return result1, result2


async def phase1():
    print('in phase1')
    return 'phase1 result'

async def phase2(arg):
    print('in phase2')
    return 'result2 derived from {}'.format(arg)

asyncio.run(outer())

output:

in outer
waiting for result 1
in phase1
waiting for result 2
in phase2

This example has two phases that must be executed in order, but that can run concurrently with other operations. The await keyword is used instead of adding the new coroutines to the loop, because control flow is already inside a coroutine being managed by the loop; it isn't necessary to tell the loop to manage the new coroutines.
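
For contrast, when you do want a new coroutine to run in the background rather than inline, you hand it to the loop explicitly. A rough sketch using create_task (the modern spelling of ensure_future for coroutines):

import asyncio

async def phase1():
    await asyncio.sleep(1)
    return 'phase1 result'

async def outer():
    # inline: outer pauses here until phase1 finishes
    inline_result = await phase1()

    # background: the task starts right away, and outer only blocks
    # on it when it awaits the task
    task = asyncio.create_task(phase1())
    print('doing other work while phase1 runs...')
    background_result = await task
    return inline_result, background_result

print(asyncio.run(outer()))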

Answered By: Milovan Tomašević

import asyncio
import requests

async def fetch_users():
    response = requests.get('https://www.testjsonapi.com/users/')
    users = response.json()
    return users

async def print_users():
    # create an asynchronous task to run concurrently,
    # which won't block the print statement below while it runs
    response = asyncio.create_task(fetch_users())
    print("Fetching users ")
    # wait to get users data from response before printing users
    users = await response

    for user in users:
        print(f"name : {user['name']} email : {user['email']}")

asyncio.run(print_users())
print("All users printed in console")

The output will look like this:

Fetching users
name : Harjas Malhotra email : [email protected]
name : Alisha Paul email : [email protected]
name : Mart Right email : [email protected]
name : Brad Pitter email : [email protected]
name : Ervin Dugg email : [email protected] 
name : Graham Bell email : [email protected]
name : James Rush email : [email protected]
name : Deepak Dev email : [email protected]
name : Ajay Rich email : [email protected]
All users printed in console

Let’s see how the code works. First, when Python calls print_users(), it won’t let the print statement below it execute until print_users() finishes. Inside print_users(), a task is created so that the statements after it can run while fetch_users() is in progress; while that task runs, "Fetching users" is printed to the console. After that, Python waits for the response from fetch_users(), because the users shouldn’t be printed before they have been received. Once fetch_users() completes, each user’s name and email is printed to the console. Finally, after print_users() completes, the print statement below it is executed.
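
One caveat about the snippet above: requests.get is a blocking call, so the event loop cannot actually run anything else while the HTTP request is in flight. A rough sketch of one way to keep the same shape without blocking the loop, assuming Python 3.9+ (for asyncio.to_thread) and the same test URL:

import asyncio
import requests

async def fetch_users():
    # run the blocking requests.get in a worker thread so the event loop
    # stays free while the request is in flight
    response = await asyncio.to_thread(
        requests.get, 'https://www.testjsonapi.com/users/'
    )
    return response.json()

async def print_users():
    task = asyncio.create_task(fetch_users())
    print("Fetching users")
    users = await task
    for user in users:
        print(f"name : {user['name']} email : {user['email']}")

asyncio.run(print_users())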

Answered By: Ruman

I don’t know why, but all the explanations on this topic are either too complex or they use examples with a useless asyncio.sleep()...
So far the best code sample I have found is this: https://codeflex.co/python3-async-await-example/

Answered By: JavaGoPro

Everyone seems to be focused on switching time.sleep to asyncio.sleep, but in the real world that isn’t always possible. Sometimes you need to make a library call that may in turn make an API call (e.g. requesting a signed URL from Google).

Here’s how you can still use time.sleep, but in an async way:

import asyncio
import time
from concurrent.futures.thread import ThreadPoolExecutor

def sleep():
    print(f'Time: {time.time() - start:.2f}')
    time.sleep(1)

async def sum(name, numbers):
    _executor = ThreadPoolExecutor(2)
    total = 0
    for number in numbers:
        print(f'Task {name}: Computing {total}+{number}')
        await loop.run_in_executor(_executor, sleep)
        total += number
    print(f'Task {name}: Sum = {total}\n')

start = time.time()

loop = asyncio.get_event_loop()
tasks = [
    loop.create_task(sum("A", [1, 2])),
    loop.create_task(sum("B", [1, 2, 3])),
]
loop.run_until_complete(asyncio.wait(tasks))
loop.close()

end = time.time()
print(f'Time: {end-start:.2f} sec')

Output:

Task A: Computing 0+1
Time: 0.00
Task B: Computing 0+1
Time: 0.00
Task A: Computing 1+2
Time: 1.00
Task B: Computing 1+2
Time: 1.00
Task A: Sum = 3

Task B: Computing 3+3
Time: 2.01
Task B: Sum = 6

Time: 3.01 sec
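
A small variation on the same idea: passing None as the executor makes run_in_executor use the event loop's default ThreadPoolExecutor, so there is no need to build one inside sum. A sketch of the sum coroutine as a drop-in replacement for the one in the script above:

async def sum(name, numbers):
    total = 0
    for number in numbers:
        print(f'Task {name}: Computing {total}+{number}')
        # None -> use the event loop's default ThreadPoolExecutor
        await asyncio.get_running_loop().run_in_executor(None, sleep)
        total += number
    print(f'Task {name}: Sum = {total}\n')
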
Answered By: SynackSA

Simple..Sweet..Awesome.. ✅

import asyncio
import time
import random

async def eat():
    wait = random.randint(0, 3)
    await asyncio.sleep(wait)
    print("Done With Eating")

async def sleep():
    wait = random.randint(0, 3)
    await asyncio.sleep(wait)
    print("Done With Sleeping")

async def repeat():
    wait = random.randint(0, 3)
    await asyncio.sleep(wait)
    print("Done With Repeating")

async def main():
    for x in range(5):
        await asyncio.gather(eat(), sleep(), repeat())
        time.sleep(2)  # blocking pause between batches; nothing else is scheduled at this point
        print("+", "-" * 20)

if __name__ == "__main__":
    t = time.perf_counter()
    asyncio.run(main())
    t2 = time.perf_counter()

    print(f'Total time elapsed: {t2 - t:0.2f} seconds')

Answered By: Rahul kuchhadia

Even though some of the answers at the top are good, I found them a little abstract; here is a more concrete example:

from datetime import datetime
import asyncio


async def time_taking(max_val, task_no):
    print("*** STARTING TO EXECUTE CONCURRENT TASK NO {} ***".format(task_no))

    await asyncio.sleep(2)
    value_list = []
    for i in range(0, max_val):
        value_list.append(i)

    print("*** FINISHING UP TASK NO {} ***".format(task_no))
    return value_list


async def test2(task_no):
    await asyncio.sleep(5)
    print("*** STARTING TO EXECUTE CONCURRENT TASK NO {} ***".format(task_no))
    await asyncio.sleep(5)
    print("*** FINISHING UP TASK NO {} ***".format(task_no))


async def function(value=None):
    tasks = []
    start_time = datetime.now()

    # create the concurrent tasks
    tasks.append(asyncio.create_task(time_taking(20, 1)))
    tasks.append(asyncio.create_task(time_taking(43, 2)))
    tasks.append(asyncio.create_task(test2(3)))

    # concurrent execution
    lists = await asyncio.gather(*tasks)
    end_time = datetime.now()

    time_taken = end_time - start_time
    return lists, time_taken


# run inside an event loop
res, time_taken = asyncio.run(function())

print(res, time_taken)

Answered By: Atif Shafi

A very simple and smooth example here:

import asyncio

async def my_task1():
    print("Task 1 started")
    await asyncio.sleep(1)  # some light io task
    print("Task 1 completed")
    return "Done1"

async def my_task2():
    print("Task 2 started")
    await asyncio.sleep(2)  # some heavy io task
    print("Task 2 completed")
    return "Done2"

async def main():
    # the two coroutines are independent of each other;
    # gather runs them concurrently and collects both results
    results = await asyncio.gather(my_task1(), my_task2())
    print(f"The results are {results}")
    
    # if task1 depends on the completion of task2, await them one after the other:
    ret1 = await my_task2()
    ret2 = await my_task1()
    print(f"The ret1: {ret1} ret2 {ret2}")

asyncio.run(main())
Answered By: Deepanshu Mehta