python asynchronous requests

Question:

So I have a list of image URLs that I want to iterate over with the requests library, downloading each image to a directory.

import pathlib

import requests

def get_image(url, image_name):
    path = pathlib.Path('/path/to/some/directory')
    response = requests.get(url, stream=True)
    with open('{}/{}.png'.format(path, image_name), 'wb') as file:
        for block in response.iter_content(1024):
            file.write(block)

# image_name must be defined per URL; here the list index is used
for image_name, url in enumerate(urls):
    get_image(url, image_name)

Now, is there a way I could create a decorator that makes a function a callback, to be run once a response is returned for a specific asynchronous request?

Asked By: Jasonca1


Answers:

If you want multiple concurrent requests plus callbacks, you can use a module like grequests. It has nothing to do with asyncio.
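To illustrate the callback style without pulling in grequests, here is a sketch using only the standard library's concurrent.futures; the fetch function below is a stand-in for a real blocking call like requests.get, so the example is self-contained:

from concurrent.futures import ThreadPoolExecutor

results = []

def fetch(url):
    # Stand-in for a blocking call such as requests.get(url)
    return 'response for {}'.format(url)

def on_response(future):
    # Callback invoked as soon as this particular future completes
    results.append(future.result())

urls = ['http://httpbin.org/get', 'http://httpbin.org/get?foo=bar']

with ThreadPoolExecutor(max_workers=4) as executor:
    for url in urls:
        executor.submit(fetch, url).add_done_callback(on_response)

# Exiting the with-block waits for all workers, so every callback has fired
print(results)

With a real fetch function, each on_response call would receive the completed response as soon as its request finishes, independently of the others.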

asyncio, by contrast, is all about avoiding callbacks (and the "callback hell" they lead to) and making asynchronous code as easy to write as synchronous code.

If you decide to try asyncio, you should either use the aiohttp client instead of requests (this is the preferred way) or run requests in a thread pool managed by asyncio. Examples of both approaches can be found here.
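The second option (running blocking calls in a thread pool from asyncio) can be sketched as follows; the download_all coroutine and the fetch stand-in are illustrative names, not part of any library, and fetch would be requests.get in real code:

import asyncio

def fetch(url):
    # Stand-in for a blocking call such as requests.get(url, stream=True)
    return 'body of {}'.format(url)

async def download_all(urls):
    loop = asyncio.get_running_loop()
    # run_in_executor moves each blocking call onto the default thread pool,
    # so the event loop stays free while the downloads run concurrently
    tasks = [loop.run_in_executor(None, fetch, url) for url in urls]
    return await asyncio.gather(*tasks)

bodies = asyncio.run(download_all(['http://httpbin.org/get',
                                   'http://httpbin.org/get?foo=bar']))
print(bodies)

asyncio.gather preserves the order of the input URLs, so each result lines up with the URL that produced it even though the calls complete in an arbitrary order.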

Answered By: Mikhail Gerasimov

The grequests project recommended above now itself recommends:

Note: You should probably use requests-threads or requests-futures instead.

Example usage of requests-futures:

from requests_futures.sessions import FuturesSession

session = FuturesSession()
# first request is started in background
future_one = session.get('http://httpbin.org/get')
# second request is started immediately
future_two = session.get('http://httpbin.org/get?foo=bar')
# wait for the first request to complete, if it hasn't already
response_one = future_one.result()
print('response one status: {0}'.format(response_one.status_code))
print(response_one.content)
# wait for the second request to complete, if it hasn't already
response_two = future_two.result()
print('response two status: {0}'.format(response_two.status_code))
print(response_two.content)
Answered By: serv-inc