Python: requests.exceptions.ConnectionError. Max retries exceeded with url

Question:

This is the script:

import requests
import json
import urlparse
from requests.adapters import HTTPAdapter

s = requests.Session()
s.mount('http://', HTTPAdapter(max_retries=1))

with open('proxies.txt') as proxies:
    for line in proxies:
        proxy=json.loads(line)

    with open('urls.txt') as urls:
        for line in urls:

            url=line.rstrip()
            data=requests.get(url, proxies=proxy)
            data1=data.content
            print data1
            print {'http': line}

As you can see, it's trying to access a list of URLs through a list of proxies. Here is the urls.txt file:

http://api.exip.org/?call=ip

here is the proxies.txt file:

{"http":"http://107.17.92.18:8080"}

I got this proxy at www.hidemyass.com. Could it be a bad proxy? I have tried several and this is the result. Note: if you are trying to replicate this, you may have to update the proxy to a recent one at hidemyass.com. They seem to stop working eventually.
Here is the full error and traceback:

Traceback (most recent call last):
  File "test.py", line 17, in <module>
    data=requests.get(url, proxies=proxy)
  File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 55, in get
    return request('get', url, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 44, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 335, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 454, in send
    history = [resp for resp in gen] if allow_redirects else []
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 144, in resolve_redirects
    allow_redirects=False,
  File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 438, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 327, in send
    raise ConnectionError(e)
requests.exceptions.ConnectionError: HTTPConnectionPool(host=u'219.231.143.96', port=18186): Max retries exceeded with url: http://www.google.com/ (Caused by <class 'httplib.BadStatusLine'>: '')
Asked By: BigBoy1337


Answers:

Looking at the stack trace you’ve provided, your error is caused by the httplib.BadStatusLine exception, which, according to the docs, is:

Raised if a server responds with a HTTP status code that we don’t understand.

In other words, whatever the proxy server returns (if it returns anything at all) cannot be parsed by httplib, which performs the actual request.

From my experience with (writing) HTTP proxies I can say that some implementations may not follow the specs too strictly (the HTTP RFCs aren’t easy reading, actually) or use hacks to accommodate old browsers with flawed implementations.

So, answering this:

Could it be a bad proxy?

…I’d say that this is possible. The only real way to be sure is to see what the proxy server actually returns.

Try to debug it with a debugger, or grab a packet sniffer (something like Wireshark or Network Monitor) to analyze what happens on the network. Knowing exactly what the proxy server returns should give you the key to solving this issue.
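
If you’d rather stay in Python than reach for a packet sniffer, a minimal sketch like the one below (the proxy address is just the one from proxies.txt, and the request is hand-written, so treat it as an illustration) opens a raw socket to the proxy and prints back exactly what it sends, which is the data httplib fails to parse:

import socket

# assumption: the same proxy as in proxies.txt; swap in whichever proxy is failing
PROXY_HOST = '107.17.92.18'
PROXY_PORT = 8080

# a plain HTTP request sent through the proxy, in the absolute-URI form proxies expect
request = (
    b"GET http://api.exip.org/?call=ip HTTP/1.1\r\n"
    b"Host: api.exip.org\r\n"
    b"Connection: close\r\n"
    b"\r\n"
)

sock = socket.create_connection((PROXY_HOST, PROXY_PORT), timeout=10)
try:
    sock.sendall(request)
    raw = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:
            break
        raw += chunk
finally:
    sock.close()

# whatever prints here is the status line httplib chokes on;
# an empty string means the proxy closed the connection without answering
print(repr(raw[:500]))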

Answered By: Eugene Loy

Maybe you are overloading the proxy server by sending too many requests in a short period of time. You say that you got the proxy from a popular free proxy website, which means you’re not the only one using that server and it’s often under heavy load.

If you add some delay between your requests, like this:

from time import sleep

[...]

data=requests.get(url, proxies=proxy)
data1=data.content
print data1
print {'http': line}
sleep(1)

(note the sleep(1) which pauses the execution of the code for one second)

Does it work?
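
If a single slow or dead proxy still raises ConnectionError even with the delay, a variation of the same loop (a sketch only, reusing the file names from the question’s script) is to catch the exception and move on instead of letting it stop the run:

import json
import requests
from time import sleep

with open('proxies.txt') as proxies_file:
    proxy = json.loads(proxies_file.readline())

with open('urls.txt') as urls:
    for line in urls:
        url = line.rstrip()
        try:
            data = requests.get(url, proxies=proxy, timeout=10)
            print(data.content)
        except requests.exceptions.ConnectionError as exc:
            # the proxy never produced a usable response; note it and carry on
            print('proxy failed for %s: %s' % (url, exc))
        sleep(1)  # pause between requests so the free proxy is not hammered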

Answered By: user2629998
def hello(self):
    # attach a browser-like User-Agent header to the session used for requests
    self.s = requests.Session()
    self.s.headers.update({'User-Agent': self.user_agent})
    return True

Try this, it worked for me πŸ™‚
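
The snippet above comes from inside a class, so self.user_agent is whatever user-agent string that class stored earlier; a standalone sketch of the same idea (the User-Agent value and proxy below are only examples) would be:

import requests

session = requests.Session()
# some proxies and servers behave badly when no browser-like User-Agent is sent,
# which can surface as connection errors; the exact string here is arbitrary
session.headers.update({'User-Agent': 'Mozilla/5.0 (X11; Linux x86_64)'})

response = session.get('http://api.exip.org/?call=ip',
                       proxies={'http': 'http://107.17.92.18:8080'})
print(response.content)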

Answered By: Ashu

This happens when you send too many requests to the public IP address of https://anydomainname.example.com/. As you can see, it is caused by something that blocks or no longer allows access to the public IP address mapped to https://anydomainname.example.com/. One workaround is the following Python script, which resolves the public IP address of the domain and adds that mapping to the /etc/hosts file.

import re
import socket
import subprocess
from typing import Tuple

ENDPOINT = 'https://anydomainname.example.com/'

def get_public_ip() -> Tuple[str, str, str]:
    """
    Command to get public_ip address of host machine and endpoint domain
    Returns
    -------
    my_public_ip : str
        Ip address string of host machine.
    end_point_ip_address : str
        Ip address of endpoint domain host.
    end_point_domain : str
        domain name of endpoint.

    """
    # bash_command = """host myip.opendns.com resolver1.opendns.com | 
    #     grep "myip.opendns.com has" | awk '{print $4}'"""
    # bash_command = """curl ifconfig.co"""
    # bash_command = """curl ifconfig.me"""
    bash_command = """ curl icanhazip.com"""
    my_public_ip = subprocess.getoutput(bash_command)
    my_public_ip = re.compile("[0-9.]{4,}").findall(my_public_ip)[0]
    end_point_domain = (
        ENDPOINT.replace("https://", "")
        .replace("http://", "")
        .replace("/", "")
    )
    end_point_ip_address = socket.gethostbyname(end_point_domain)
    return my_public_ip, end_point_ip_address, end_point_domain


def set_etc_host(ip_address: str, domain: str) -> str:
    """
    A function to write mapping of ip_address and domain name in /etc/hosts.
    Ref: https://stackoverflow.com/questions/38302867/how-to-update-etc-hosts-file-in-docker-image-during-docker-build

    Parameters
    ----------
    ip_address : str
        IP address of the domain.
    domain : str
        domain name of endpoint.

    Returns
    -------
    str
        Message to identify success or failure of the operation.

    """
    bash_command = """echo "{}    {}" >> /etc/hosts""".format(ip_address, domain)
    output = subprocess.getoutput(bash_command)
    return output


if __name__ == "__main__":
    my_public_ip, end_point_ip_address, end_point_domain = get_public_ip()
    output = set_etc_host(ip_address=end_point_ip_address, domain=end_point_domain)
    print("My public IP address:", my_public_ip)
    print("ENDPOINT public IP address:", end_point_ip_address)
    print("ENDPOINT Domain Name:", end_point_domain )
    print("Command output:", output)

You can call the above script before running your desired function πŸ™‚
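
For example, if the script above is saved as update_hosts.py (the module name is just an assumption), its helpers could be called right before the failing requests are made; note that appending to /etc/hosts normally requires root privileges:

# hypothetical module name; adjust it to wherever the script above was saved
from update_hosts import get_public_ip, set_etc_host

my_ip, endpoint_ip, endpoint_domain = get_public_ip()
# needs permission to write to /etc/hosts (typically run as root)
print(set_etc_host(ip_address=endpoint_ip, domain=endpoint_domain))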

Answered By: Vaibhav Hiwase

This happens when you overload the server with too many requests. To get around it you can increase the time between requests, but the thing that worked best in my case was to increase the number of retries for each request:

requests.adapters.DEFAULT_RETRIES = 5 # increase retries number
requests.get(url)

If this is still not helpful, you can find more approaches here.
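
A related, more targeted option (a sketch using urllib3’s Retry class, which requests uses under the hood; the numbers here are arbitrary) is to mount an adapter whose retries also back off between attempts, rather than raising DEFAULT_RETRIES globally:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retry = Retry(
    total=5,             # retry up to five times
    backoff_factor=0.5,  # grow the delay between retry attempts exponentially
    status_forcelist=[500, 502, 503, 504],
)
session.mount('http://', HTTPAdapter(max_retries=retry))
session.mount('https://', HTTPAdapter(max_retries=retry))

response = session.get('http://api.exip.org/?call=ip')
print(response.content)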

Answered By: dkaradima