
Using this curl command I am able to get the response I am looking for from Bash:

curl -v -u z:secret_key --proxy http://proxy.net:80  \
-H "Content-Type: application/json" https://service.com/data.json

I have already seen this other post on proxies with the Requests module

It helped me formulate my Python code, but I need to make the request through a proxy, and even when I supply the proper proxies it doesn't work. Perhaps I'm just not seeing something?

>>> requests.request('GET', 'https://service.com/data.json',
...                  headers={'Content-Type': 'application/json'},
...                  proxies={'http': 'http://proxy.net:80', 'https': 'http://proxy.net:80'},
...                  auth=('z', 'secret_key'))

Furthermore, at the same Python console I can use urllib to make a request and have it succeed.

>>> import urllib
>>> urllib.urlopen("http://www.httpbin.org").read()
---results---

Even trying requests on just a non-https address fails to work.

>>> requests.get('http://www.httpbin.org')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Library/Python/2.6/site-packages/requests/api.py", line 79, in get
    return request('get', url, **kwargs)
  File "/Library/Python/2.6/site-packages/requests/api.py", line 66, in request
    prefetch=prefetch
  File "/Library/Python/2.6/site-packages/requests/sessions.py", line 191, in request
    r.send(prefetch=prefetch)
  File "/Library/Python/2.6/site-packages/requests/models.py", line 454, in send
    raise ConnectionError(e)
requests.exceptions.ConnectionError: Max retries exceeded for url:

Requests is so elegant and awesome but how could it be failing in this instance?

dalanmiller
  • http://pycurl.sourceforge.net/ – Derek Litz Dec 13 '11 at 05:26
  • I know that I could probably set up and use pycurl on my Mac without too much trouble (or likely any at all). I was just trying to go for the more elegant solution of using Requests, which is pretty awesome and clean. Thank you for the suggestion though. – dalanmiller Dec 14 '11 at 02:20
  • Setting up a proxy for use with requests works just fine here. Ideally we could reproduce what you're seeing... otherwise telling us why it doesn't work is the only other option. Are you getting a stack trace from requests? You could also monitor your network and check the actual requests, since I can only guess they have to be different for a different effect to be observed between curl/requests. – Derek Litz Dec 14 '11 at 13:09
  • I'm now noticing that https requests of any kind using any library/module are not working within Python. However, doing just a normal http request works fine. Think it could be my environment variables? How would I check what is wrong? – dalanmiller Dec 14 '11 at 22:08
  • requests does https cert validation by default. perhaps it's failing to validate the cert for your proxy? – Chris AtLee Dec 31 '11 at 00:55
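
Following up on that last comment, one way to test the cert-validation hypothesis is a sketch like the following (host, proxy, and credentials are taken from the question and are hypothetical; disabling verification is for diagnosis only, never production):

```python
# Diagnostic sketch: requests validates TLS certificates by default, so a
# proxy or origin presenting an unverifiable cert makes the request fail.
# Toggling verify=False rules that cause in or out.
import requests

session = requests.Session()
session.proxies = {'http': 'http://proxy.net:80',
                   'https': 'http://proxy.net:80'}
session.auth = ('z', 'secret_key')

# Session.verify defaults to True; flip it off only to test the hypothesis:
session.verify = False
# r = session.get('https://service.com/data.json')  # real call elided
print(session.verify)
```

If the request succeeds with `verify=False` but fails without it, the certificate chain is the problem rather than the proxy configuration.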

2 Answers


The problem actually lies with Python's standard URL libraries - urllib/urllib2/httplib. I can't remember which one is the exact culprit, but for simplicity's sake let's just call it urllib. Unfortunately, urllib doesn't implement the HTTP CONNECT method, which is required for accessing an https site through an http(s) proxy. My efforts to add that functionality on top of urllib were not successful (it has been a while since I tried). So unfortunately the only option I know to work in this case is to use pycurl.
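
To make the missing piece concrete, here is a minimal sketch of the CONNECT request a client must send to an HTTP proxy before it can tunnel TLS to an https origin. The host and port are illustrative; this only builds the raw request text and does not open a real connection:

```python
# Sketch of the HTTP CONNECT handshake text. After the proxy answers
# "HTTP/1.1 200 Connection established", the client starts the TLS
# handshake with the origin over the same socket.

def build_connect_request(host, port, proxy_auth=None):
    """Return the raw CONNECT request for tunnelling through an HTTP proxy.

    proxy_auth, if given, is a base64-encoded "user:pass" string, as sent
    by curl's proxy authentication.
    """
    lines = [
        "CONNECT %s:%d HTTP/1.1" % (host, port),
        "Host: %s:%d" % (host, port),
    ]
    if proxy_auth:
        lines.append("Proxy-Authorization: Basic %s" % proxy_auth)
    return "\r\n".join(lines) + "\r\n\r\n"

print(build_connect_request("service.com", 443))
```

A client library that never issues this request can talk plain http through a proxy but has no way to reach an https origin through it.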

However, there is a relatively clean solution with almost exactly the same API as python requests, but using a pycurl backend instead of the standard libraries.

The library is called human_curl. I've used it myself and have had great results.

ravenac95
  • That is not correct. urllib2 **does** support HTTP CONNECT (http://bugs.python.org/issue1424152), while requests didn't support it until 2.0 (https://github.com/kennethreitz/requests/pull/1515). – schlamar Sep 25 '13 at 18:32

Believing the answer above, we tried human_curl.

human_curl gave unhelpful errors like "Unknown error", whereas urllib3 gave accurate ones like "Request timed out" and "Max retries exceeded with url".

So we went back to urllib3; urllib3 is thread-safe, and we are happy with it.

The only problem now is that we get "Max retries exceeded" and can't solve it. We're guessing it might have to do with the server or the proxy, but we're not sure.
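
For what it's worth, recent urllib3 versions let you bound how "Max retries exceeded" surfaces by passing an explicit Retry policy instead of the default (the proxy URL below is hypothetical, taken from the question):

```python
# Sketch: configure an explicit retry policy so failures surface after a
# known number of attempts with backoff, instead of the library default.
import urllib3
from urllib3.util.retry import Retry

retry = Retry(
    total=3,             # at most 3 attempts overall
    backoff_factor=0.5,  # sleep ~0.5s, 1s, 2s between attempts
    status_forcelist=[502, 503, 504],  # also retry common proxy errors
)
http = urllib3.ProxyManager("http://proxy.net:80", retries=retry)
# r = http.request("GET", "http://www.httpbin.org/")  # real call elided
print(retry.total)
```

If the error persists even with generous retries, the proxy or server side is the likelier culprit than the client configuration.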

  • I am using requests at work and everything seems to work fine, including communication over https connections. Further, we use proxies for debugging http requests. If you can shed some light on your issue, I might be able to help you. – Ifthikhan Aug 06 '12 at 10:00