Python 3 urllib with self-signed certificates
Question:
I’m attempting to download some data from an internal server using Python. Since it’s internal, it uses a self-signed certificate. (We don’t want to pay Verisign for servers that will never appear “in the wild.”) The Python 2.6 version of the code worked fine.
response = urllib2.urlopen(URL)
data = csv.reader(response)
I’m now trying to update to Python 3.4 (long story, don’t ask.) However, using Python 3’s urllib fails:
response = urllib.request.urlopen(URL)
It throws a CERTIFICATE_VERIFY_FAILED error.
urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:600)>
In reading around the web, apparently Python 2.6 urllib2 doesn’t bother to verify certificates. Some versions of urllib allow “verify=False” to be passed to the method signature, but that doesn’t appear to work in Python 3.4.
Does anyone know how I can get around this? I’d like to avoid using the Requests package because of corporate security guidelines.
Answers:
urllib.request.urlopen has a context keyword parameter that accepts an ssl.SSLContext object. Passing a context whose verify_mode is set to ssl.CERT_NONE is the equivalent of verify=False.
Use the following to disable SSL certificate validation for a single URL:
import ssl
from urllib.request import urlopen

myssl = ssl.create_default_context()
myssl.check_hostname = False  # must be disabled before setting CERT_NONE
myssl.verify_mode = ssl.CERT_NONE
urlopen("URL", context=myssl)
Use the following to disable SSL certificate validation for all URLs in the process:
import ssl
from urllib.request import urlopen

ssl._create_default_https_context = ssl._create_unverified_context
urlopen("URL")
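As a quick sanity check (not part of the original answer), you can compare the context the monkey-patch above swaps in against the default one — it has hostname checking off and verification disabled:

```python
import ssl

# Default context: verifies the peer certificate and checks the hostname.
default_ctx = ssl.create_default_context()
print(default_ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(default_ctx.check_hostname)                    # True

# Unverified context: what ssl._create_unverified_context() produces.
unverified_ctx = ssl._create_unverified_context()
print(unverified_ctx.verify_mode == ssl.CERT_NONE)   # True
print(unverified_ctx.check_hostname)                 # False
```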
With urllib3:
import urllib3

# Create a PoolManager that skips certificate verification
http = urllib3.PoolManager(cert_reqs='CERT_NONE')
urllib3.disable_warnings()
http.request('GET', 'https://example.org')
As this answer still pops up first when I Google ‘CERTIFICATE_VERIFY_FAILED error’, I would like to provide an update:
Fully turning off SSL verification is not a good practice in general (I’ll give a reason below).
It is better to add the self-signed certificate to the locally trusted certificates than to deactivate the verification completely:
import ssl
import urllib.request

# trust the self-signed server certificate
myssl = ssl.create_default_context()
myssl.load_verify_locations('my_server_cert.pem')

# send the request
response = urllib.request.urlopen("URL", context=myssl)
Why should we care about certificate validation?
NIST and other cyber-security organizations recommend the zero-trust model. In short: just because your server sits on the local network does not mean it is protected. You should make at least a minimal effort to secure everything on the local network.
So if you have a web server with SSL and the client does not validate its certificate, an attacker could still impersonate that web server or steal information via a man-in-the-middle attack.
You already have the server’s certificate, so why not add it on the client side and enjoy the full protection of SSL?
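If you don’t yet have the server’s certificate as a PEM file, the standard library can fetch it for you. A sketch (example.org stands in for your internal host; note that fetching the certificate over the same untrusted channel you later verify against is only a trust-on-first-use convenience — obtaining it out of band is safer):

```python
import ssl

# Retrieve the server's certificate in PEM form over the network.
pem = ssl.get_server_certificate(('example.org', 443))

# Save it for use with SSLContext.load_verify_locations().
with open('my_server_cert.pem', 'w') as f:
    f.write(pem)
```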