How to speed up API requests?

Question:

I’ve constructed the following little program for getting phone numbers using Google’s Places API, but it’s pretty slow. When I’m testing with 6 items it takes anywhere from 1.99s to 4.86s, and I’m not sure why the time varies so much. I’m very new to APIs, so I’m not even sure what sort of things can and cannot be sped up — which things are down to the web server servicing the API, and which I can change myself.

import requests, json, time
searchTerms = input("input places separated by comma")

start_time = time.time() #timer
searchTerms = searchTerms.split(',')
for i in searchTerms:
    r1 = requests.get('https://maps.googleapis.com/maps/api/place/textsearch/json?query='+ i +'&key=MY_KEY')
    a = r1.json()
    pid = a['results'][0]['place_id']
    r2 = requests.get('https://maps.googleapis.com/maps/api/place/details/json?placeid='+pid+'&key=MY_KEY')
    b = r2.json()
    phone = b['result']['formatted_phone_number']
    name = b['result']['name']
    website = b['result']['website']
    print(phone+' '+name+' '+website)

print("--- %s seconds ---" % (time.time() - start_time))
Asked By: click here


Answers:

Most of the time isn’t spent computing your request; it’s spent communicating with the server, and that network latency is something you cannot control. It also varies from run to run, which is why your timings fluctuate so much.

However, you may be able to speed things up using parallelization. As a start, create a separate thread for each request:

import requests
from threading import Thread

def request_search_terms(term):
    # the same two-step lookup as in the question
    r1 = requests.get('https://maps.googleapis.com/maps/api/place/textsearch/json?query=' + term + '&key=MY_KEY')
    pid = r1.json()['results'][0]['place_id']
    r2 = requests.get('https://maps.googleapis.com/maps/api/place/details/json?placeid=' + pid + '&key=MY_KEY')
    result = r2.json()['result']
    print(result['formatted_phone_number'], result['name'], result['website'])

# searchTerms is read and split as in the question

threads = []
for st in searchTerms:
    threads.append(Thread(target=request_search_terms, args=(st,)))
    threads[-1].start()

for t in threads:
    t.join()

Then, as the number of requests grows, switch to a thread pool; this avoids the overhead of creating a new thread per request.
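
For instance, here is a minimal sketch using concurrent.futures from the standard library, reusing the request_search_terms function above (the pool size of 8 is an arbitrary choice — tune it to your workload):

from concurrent.futures import ThreadPoolExecutor

with ThreadPoolExecutor(max_workers=8) as executor:
    # one task per search term; the with block waits for them all to finish
    executor.map(request_search_terms, searchTerms)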

It’s a matter of latency between the client and the server; you can’t change that unless you have multiple server locations (so that the server nearest the client receives the request).

In terms of performance, you can build a multithreaded system that handles multiple requests at once.

Answered By: Mhamed Mansouri

You may want to send the requests in parallel. Python provides the multiprocessing module, which is suitable for tasks like this.

Sample code:

import requests
from multiprocessing import Pool

def get_data(i):
    r1 = requests.get('https://maps.googleapis.com/maps/api/place/textsearch/json?query='+ i +'&key=MY_KEY')
    a = r1.json()
    pid = a['results'][0]['place_id']
    r2 = requests.get('https://maps.googleapis.com/maps/api/place/details/json?placeid='+pid+'&key=MY_KEY')
    b = r2.json()
    phone = b['result']['formatted_phone_number']
    name = b['result']['name']
    website = b['result']['website']
    return ' '.join((phone, name, website))

if __name__ == '__main__':
    terms = input("input places separated by comma").split(",")
    with Pool(5) as p:
        print(p.map(get_data, terms))
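
Note that these requests are I/O-bound rather than CPU-bound, so a thread-backed pool is usually cheaper than spawning processes. multiprocessing.dummy exposes the same Pool API over threads; a sketch reusing get_data from the sample above:

from multiprocessing.dummy import Pool  # identical interface, backed by threads

if __name__ == '__main__':
    terms = input("input places separated by comma").split(",")
    with Pool(5) as p:
        print(p.map(get_data, terms))
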
Answered By: Łukasz Rogalski

There is no need to do the multithreading yourself. grequests provides a gevent-based, nearly drop-in replacement for requests that sends requests concurrently.
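
A minimal sketch, assuming grequests is installed (pip install grequests): grequests.get builds an unsent request, and grequests.map sends them all concurrently:

import grequests

urls = ['https://maps.googleapis.com/maps/api/place/textsearch/json?query=' + term + '&key=MY_KEY'
        for term in searchTerms]
unsent = (grequests.get(u) for u in urls)
for response in grequests.map(unsent):
    print(response.json()['results'][0]['place_id'])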

Answered By: qwr

Use sessions to enable persistent HTTP connections, so you don’t have to establish a new connection for every request.

Docs: Requests Advanced Usage – Session Objects
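
A minimal sketch of the question’s loop with a single Session; passing the query via params also URL-encodes the search term:

import requests

with requests.Session() as session:  # one connection reused across all requests
    for term in searchTerms:
        r = session.get('https://maps.googleapis.com/maps/api/place/textsearch/json',
                        params={'query': term, 'key': 'MY_KEY'})
        print(r.json()['results'][0]['place_id'])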

Answered By: Joe Heffer