
Yesterday I was setting up a proxy on Ubuntu for a test, and I also installed some Mono libraries, including new certificates for the Linux build of Fiddler. After turning off all my proxy settings (including disabling the proxy "System wide" in the network settings), every tool except Python can reach the network normally. Browsers work fine, I can ping endpoints, and I can use wget and curl without any issue, but Python's network settings seem screwed up.

Here's the output when I run pip install -U pyopenssl:

$ pip install -U pyopenssl
Collecting pyopenssl
  Retrying (Retry(total=4, connect=None, read=None, redirect=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', NewConnectionError('<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f89bd58dfd0>: Failed to establish a new connection: [Errno 111] Connection refused',))': /simple/pyopenssl/
  Retrying (Retry(total=3, connect=None, read=None, redirect=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', NewConnectionError('<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f89bd58db10>: Failed to establish a new connection: [Errno 111] Connection refused',))': /simple/pyopenssl/
  ...
  Retrying (Retry(total=0, connect=None, read=None, redirect=None)) after connection broken by 'ProxyError('Cannot connect to proxy.', NewConnectionError('<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f89bd595f90>: Failed to establish a new connection: [Errno 111] Connection refused',))': /simple/pyopenssl/
  Could not find a version that satisfies the requirement pyopenssl (from versions: )
No matching distribution found for pyopenssl

Running a simple urllib2 test also fails (urllib2.URLError):

import urllib2

def internet_on():
    # Try to reach a known-good IP directly, with a short timeout.
    try:
        urllib2.urlopen('http://216.58.192.142', timeout=1)
        print "it worked!"
        return True
    except urllib2.URLError:
        print "it did not work"
        return False

internet_on()

The requests library also instantly fails in some tools I use:

requests.exceptions.ProxyError: HTTPConnectionPool(host='127.0.0.1', port=8888): Max retries exceeded with url: http://10.30.0.63/ (Caused by ProxyError('Cannot connect to proxy.', NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f4c9c327f90>: Failed to establish a new connection: [Errno 111] Connection refused',)))

I'm guessing Python thinks I'm behind a proxy when I'm not anymore. Is there a way to debug or reset Python's network configuration?
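For what it's worth, I know requests sessions have a trust_env switch that makes them ignore proxy-related environment variables. A sketch of the kind of check I mean (reusing the Google IP from above); if this succeeds while a plain requests.get() fails with ProxyError, the environment would seem to be the culprit:

import requests

session = requests.Session()
session.trust_env = False  # don't read http_proxy/https_proxy from the environment

# Compare against a plain requests.get(), which does honor the environment.
print session.get('http://216.58.192.142', timeout=5).status_code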

voodoogiant

2 Answers

Fixed the issue. Based on heemayl's comment about an http_proxy variable, I checked the environment variables in my .bashrc and saw that a script had apparently added http_proxy and https_proxy exports, which urllib.getproxies() picks up.

Removing those lines fixed the issue. Thanks, heemayl.
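
For anyone who wants to confirm what Python is picking up before editing anything, a quick check from the interpreter (Python 2, to match the snippets above):

import os
import urllib

# getproxies() reads http_proxy/https_proxy (and friends) from the
# environment; it's the same lookup urllib2 and requests fall back on.
print urllib.getproxies()

# The raw variables, for comparison with `env | grep -i proxy`:
for var in ('http_proxy', 'https_proxy', 'no_proxy'):
    print var, '=', os.environ.get(var)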

voodoogiant

Have you tried pip's --trusted-host pypi.python.org and/or --proxy http://PROXYNAME.com options?

I've had similar experiences with pip interacting with system proxies. My guess is that pip tries to reach PyPI, gets stopped at the (now dead) proxy, and so reports that no matching distribution was found.

The full command would look something like:

pip install -U --trusted-host pypi.python.org --proxy http://PROXYNAME.com pyopenssl

--trusted-host helps if your proxy interferes with certificate verification.

Let me know if that helps! Also, check env | grep proxy just in case something is wonky with the way you are setting the proxy in the environment.

More generally, check out this question about handling proxies with urllib.
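
If you want to rule the environment out from inside Python itself, an empty ProxyHandler takes precedence over the proxy environment variables (a minimal sketch, reusing the IP from the question):

import urllib2

# An empty ProxyHandler overrides http_proxy/https_proxy, so the
# request goes out directly instead of through the dead proxy.
opener = urllib2.build_opener(urllib2.ProxyHandler({}))
print opener.open('http://216.58.192.142', timeout=5).getcode()

If that succeeds while a plain urllib2.urlopen() fails, leftover environment variables are almost certainly to blame.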