So I have a pretty basic Python script that I'm trying to use to retrieve information from a website. I know the code is correct because it runs just fine on my home PC, but it fails on my company's PC.
import urllib2
response = urllib2.urlopen('https://www.website.com/')
Every time I run the script, it just hangs for a couple of minutes until it eventually throws the following error:
IOError: [Errno socket error] [Errno 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond
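In case it helps anyone reproduce this: to stop the script from hanging for minutes before failing, I've been setting a short socket timeout so the error surfaces quickly. A minimal sketch (the try/except import is just so it runs under both Python 2's urllib2 and Python 3's urllib.request; the 10-second value is arbitrary):

```python
import socket

try:
    import urllib2 as urlreq          # Python 2
except ImportError:
    import urllib.request as urlreq   # Python 3 renamed urllib2

# Fail fast: every new socket (including the one urlopen creates)
# gives up after 10 seconds instead of hanging for minutes.
socket.setdefaulttimeout(10)

# On Python 2.6+ urlopen also accepts a per-call timeout:
# response = urlreq.urlopen('https://www.website.com/', timeout=10)
```

This doesn't fix the underlying connection problem, it just makes each test run much faster.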
After searching other posts, I found that similar problems were usually proxy-related. I checked my PC's network settings for a proxy address and found that my company uses an automatic configuration script. I followed the link, downloaded the script, and I think I found which proxy I should use. I edited my code according to another post on how to go through a proxy. See below:
import urllib2
proxy = urllib2.ProxyHandler({'http': 'proxy1.companywebsite.com:80'})
opener = urllib2.build_opener(proxy)
urllib2.install_opener(opener)
response = urllib2.urlopen('https://www.website.com/')
When I run this, it still hangs and throws an error:
URLError: urlopen error [Errno 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond
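EDIT: One thing I noticed while re-reading my code: the ProxyHandler above only maps the 'http' scheme, but the URL I'm fetching is https, so I suspect urllib2 never routes that request through the proxy at all. Here is a sketch that registers the proxy for both schemes (proxy1.companywebsite.com:80 is the same placeholder as above; whether that host/port also handles HTTPS traffic on my company's network is an assumption on my part):

```python
try:
    import urllib2 as urlreq          # Python 2
except ImportError:
    import urllib.request as urlreq   # Python 3 equivalent

# Map BOTH schemes to the proxy: with only an 'http' entry,
# an https:// URL bypasses the proxy entirely.
proxy = urlreq.ProxyHandler({
    'http':  'http://proxy1.companywebsite.com:80',
    'https': 'http://proxy1.companywebsite.com:80',
})
opener = urlreq.build_opener(proxy)
urlreq.install_opener(opener)

# response = urlreq.urlopen('https://www.website.com/')
```

I still don't know if this is the whole story, but it seems worth ruling out first.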
If anybody has any ideas on why this isn't working, I'd greatly appreciate it. I've spent hours trying to figure this out.