I'm using MyIPHide. I downloaded their client software, installed it, and turned the service on.
I can access https websites fine with a browser, but I cannot use requests to get the pages.
This works:
import requests

IP = requests.get('http://api.ipify.org').text
proxyDict = {
    "http": IP,
    "https": IP,
}

url = 'http://www.cnn.com'
r = requests.get(url, proxies=proxyDict)
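As a sanity check, one way to confirm a proxy is actually carrying the traffic is to fetch api.ipify.org both directly and through the proxy and compare the two IPs. The helper below is my own sketch, not part of requests:

```python
def proxy_is_active(proxies):
    # Sketch: compare the apparent public IP with and without the proxy.
    # If the two match, the proxy is not actually carrying the traffic.
    import requests  # imported lazily so the sketch stands alone
    direct = requests.get('http://api.ipify.org').text
    via_proxy = requests.get('http://api.ipify.org', proxies=proxies).text
    return direct != via_proxy
```

If this returns False for the http-only dict above, the "working" http request may simply be bypassing the proxy.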
This doesn't:
url = 'https://www.cnn.com'
r = requests.get(url, proxies=proxyDict)
The only difference is http vs https.
Here is the traceback:
File "C:\Python27\lib\site-packages\requests\adapters.py", line 502, in send
raise ProxyError(e, request=request)
ProxyError: HTTPSConnectionPool(host='www.cnn.com', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', error(10053, 'An established connection was aborted by the software in your host machine')))
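For what it's worth, requests documents the values of the proxies mapping as full proxy URLs ("scheme://host:port"), whereas the dict above holds a bare IP with no scheme or port. A minimal sketch of the form I believe requests expects; the host and port are placeholders, not values from MyIPHide:

```python
proxy_host = "127.0.0.1"  # placeholder: wherever the proxy client listens
proxy_port = 8080         # placeholder: replace with the real port

proxy_url = "http://{}:{}".format(proxy_host, proxy_port)
proxyDict = {
    "http": proxy_url,
    "https": proxy_url,  # https traffic tunnels through the same http:// proxy URL
}
```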
I've tried other https websites; none of them work either.
I have also emailed MyIPHide support. They said all their proxies support https, which is only true when I use a browser.
One workaround that does work is to load the page with Selenium, then use driver.page_source to get the text.
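Roughly, the Selenium workaround looks like this (driver setup is an assumption; adjust for whatever browser/driver is installed):

```python
def fetch_page_source(url):
    # Sketch of the workaround; assumes Chrome with chromedriver on PATH.
    from selenium import webdriver  # imported lazily so the sketch stands alone
    driver = webdriver.Chrome()
    try:
        driver.get(url)
        return driver.page_source
    finally:
        driver.quit()  # always release the browser, even on error
```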
It's not a proxy server problem: I also bought a private proxy server address through sslprivateproxy.com, put in its IP and port, and still get the same errors.
I'm using Python 2.7.15 and requests 2.20.1. Non-proxy use of requests works, i.e.:
import requests
url='https://www.cnn.com'
r=requests.get(url)
>>> r
<Response [200]>
>>>
I also tried Python 3.6 with requests 2.20.1, with the same results.