requests: how to disable / bypass proxy

Question:

I am fetching a URL with:

r = requests.get("http://myserver.com")

As I can see in the access.log of myserver.com, the client’s system proxy is used. But I want to disable the use of proxies entirely with requests.

Asked By: t777


Answers:

The only way I’m currently aware of for disabling proxies entirely is the following:

  • Create a session
  • Set session.trust_env to False
  • Create your request using that session

import requests

session = requests.Session()
session.trust_env = False

response = session.get('http://www.stackoverflow.com')

This is based on this comment by Lukasa and the (limited) documentation for requests.Session.trust_env.

Note: Setting trust_env to False also ignores the following:

  • Authentication information from .netrc
  • CA bundles defined in REQUESTS_CA_BUNDLE or CURL_CA_BUNDLE
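If you still need a custom CA bundle after setting trust_env to False, you can re-apply it explicitly, since REQUESTS_CA_BUNDLE is no longer consulted. A minimal sketch (the bundle path is a placeholder):

```python
import requests

session = requests.Session()
session.trust_env = False  # ignore proxy env vars, .netrc, and CA-bundle env vars

# trust_env=False means REQUESTS_CA_BUNDLE is ignored, so point
# requests at the bundle explicitly; this path is a placeholder.
session.verify = "/path/to/ca-bundle.pem"
```

Requests made on this session now bypass the proxy while still verifying against your bundle.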

If, however, you only want to disable proxies for a particular domain (like localhost), you can use the NO_PROXY environment variable:

import os
import requests

os.environ['NO_PROXY'] = 'stackoverflow.com'

response = requests.get('http://www.stackoverflow.com')
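NO_PROXY also accepts a comma-separated list, so several hosts can bypass the proxy at once; entries match the host itself and any of its subdomains. A small sketch (the hostnames are placeholders):

```python
import os
import urllib.request

# Exempt several hosts at once; these names are placeholders.
os.environ['NO_PROXY'] = 'localhost,127.0.0.1,internal.example.com'

# urllib's bypass check (which requests relies on) matches these
# entries against the host and any of its subdomains:
print(urllib.request.proxy_bypass_environment('localhost'))                  # 1
print(urllib.request.proxy_bypass_environment('api.internal.example.com'))  # 1
```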
Answered By: Lukas Graf

The requests library respects proxy environment variables:
http://docs.python-requests.org/en/latest/user/advanced/#proxies

So try deleting the HTTP_PROXY and HTTPS_PROXY environment variables:

import os
for k in list(os.environ.keys()):
    if k.lower().endswith('_proxy'):
        del os.environ[k]
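If you only need the variables gone temporarily, unittest.mock.patch.dict can restore the original environment on exit; a sketch of that scoped variant:

```python
import os
from unittest import mock

# patch.dict snapshots os.environ and restores it when the block exits,
# so the proxy variables are only removed temporarily.
with mock.patch.dict(os.environ):
    for k in list(os.environ):
        if k.lower().endswith('_proxy'):
            del os.environ[k]
    # requests calls made here see no proxy variables

# outside the block, the original environment is back
```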
Answered By: KostasT

You can choose proxies for each request. From the docs:

import requests

proxies = {
  "http": "http://10.10.1.10:3128",
  "https": "http://10.10.1.10:1080",
}

requests.get("http://example.org", proxies=proxies)

So to disable the proxy, just set each one to None:

import requests

proxies = {
  "http": None,
  "https": None,
}

requests.get("http://example.org", proxies=proxies)
Answered By: jtpereyda

The way to stop requests/urllib from proxying any requests is to set the no_proxy (or NO_PROXY) environment variable to *, e.g. in bash:

export no_proxy='*'

Or from Python:

import os
os.environ['no_proxy'] = '*' 

This works because the urllib.request.getproxies function first checks for any proxies set in environment variables (e.g. http_proxy, HTTP_PROXY, https_proxy, HTTPS_PROXY, etc.); if none are set, it checks for system-configured proxies using platform-specific calls (on macOS it queries the system scutil/configd interfaces, and on Windows it checks the Registry). As mentioned in the comments, if any proxy variables are set you can reset them as @udani suggested, or unset them from Python like this:

del os.environ['HTTP_PROXY']

Then, when urllib attempts to use a proxy, the ProxyHandler checks for the presence and value of the no_proxy environment variable, which can either be set to specific hostnames as mentioned above, or to the special value *, whereby all hosts bypass the proxy.
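The lookup order described above can be observed directly; a small sketch using urllib (proxy.example is a placeholder host):

```python
import os
import urllib.request

# Simulate a proxied environment; proxy.example is a placeholder host.
os.environ['http_proxy'] = 'http://proxy.example:3128'
print(urllib.request.getproxies())  # environment proxies win: {'http': ...}

# With no_proxy='*', the bypass check succeeds for every host.
os.environ['no_proxy'] = '*'
print(urllib.request.proxy_bypass_environment('example.org'))  # 1
```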

Answered By: Pierz

With Python3, jtpereyda’s solution didn’t work, but the following did:

proxies = {
    "http": "",
    "https": "",
}
Answered By: Oriol Vilaseca
I faced the same issue when connecting to localhost to access my .NET backend from a Python script with the requests module.

I set verify to False, which disables the default SSL certificate verification:

r = requests.post('https://localhost:44336/api/', data='', verify=False)

P.S. The above code will emit an InsecureRequestWarning, which can be suppressed as follows:

import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
r = requests.post('https://localhost:44336/api/', data='', verify=False)
Answered By: ritwik chakraborty

I implemented @jtpereyda’s solution in our production codebase. It worked fine for normal successful HTTP requests (200 OK), but stopped working when receiving an HTTP redirect (301 Moved Permanently). Instead, use:

requests.get("https://pypi.org/pypi/pillow/9.0.0/json", proxies={"http": "", "https": ""})

For comparison, this line causes a requests.exceptions.SSLError when behind a proxy (pypi.org tries to redirect us to Pillow, with an uppercase P):

requests.get("https://pypi.org/pypi/pillow/9.0.0/json", proxies={"http": None, "https": None})
Answered By: xjcl

For those for whom no_proxy="*" doesn’t work, try 0.0.0.0/32; that worked for me.

Answered By: Henning Hasemann