16

If I run:

urllib2.urlopen('http://google.com')

Even if I use another URL, I get the same error.

I'm pretty sure there is no firewall running on my computer or router, and the internet (from a browser) works fine.
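For reference, here is a minimal sketch (assuming Python 2, where urllib2 lives) that wraps the call so the underlying error can be inspected:

import urllib2

try:
    response = urllib2.urlopen('http://google.com')
    print response.read(100)
except urllib2.URLError as e:
    # e.reason carries the underlying socket error, e.g. [Errno 11004] getaddrinfo failed
    print 'urlopen failed:', e.reason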

Paul Rooney
quilby

6 Answers

6

The problem, in my case, was that some installer had at some point defined an http_proxy environment variable on my machine, even though I had no proxy.

Removing the http_proxy environment variable fixed the problem.
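A minimal sketch (assuming Python 2) of both checks: look for a stray http_proxy variable, then build an opener that ignores proxy settings entirely so you can confirm the proxy is the culprit:

import os
import urllib2

# See whether anything has defined http_proxy in this process's environment
print 'http_proxy =', os.environ.get('http_proxy')

# An empty ProxyHandler bypasses proxies regardless of environment variables
opener = urllib2.build_opener(urllib2.ProxyHandler({}))
print opener.open('http://www.google.com').read(100)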

Declan Brennan
5

The site's DNS record is such that Python's DNS lookup fails in a peculiar way: it finds the entry but gets back zero associated IP addresses. (Verify with nslookup.) Hence 11004, WSANO_DATA.

Prefix the site with 'www.' and try the request again. (Use nslookup to verify that its result is different, too.)

This fails essentially the same way with the Python Requests module:

requests.exceptions.ConnectionError: HTTPConnectionPool(host='...', port=80): Max retries exceeded with url: / (Caused by : [Errno 11004] getaddrinfo failed)
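For completeness, a minimal sketch (assuming Python 2) of the same nslookup check done from Python itself; 'example-site.com' is a placeholder for the host that fails:

import socket

for host in ('example-site.com', 'www.example-site.com'):
    try:
        print host, '->', socket.gethostbyname(host)
    except socket.gaierror as e:
        # On Windows, errno 11004 (WSANO_DATA) means the name exists but carries no address data
        print host, '-> lookup failed:', e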

JimB
2

This may not help you if it's a network-level issue, but you can get some debugging information by setting debuglevel on httplib. Try this:

import urllib
import urllib2
import httplib

url = 'http://www.mozillazine.org/atom.xml'

# Make httplib print the requests it sends and the replies it receives
httplib.HTTPConnection.debuglevel = 1

print "urllib"
data = urllib.urlopen(url).read()

print "urllib2"
request = urllib2.Request(url)
opener = urllib2.build_opener()
feeddata = opener.open(request).read()

This is copied directly from here (hope that's kosher): http://bytes.com/topic/python/answers/517894-getting-debug-urllib2
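As an alternative, here is a minimal sketch (assuming Python 2) that asks urllib2 itself for debug output via an HTTPHandler, rather than relying on the httplib class attribute:

import urllib2

# debuglevel=1 makes the handler print the request it sends and the response headers it gets back
opener = urllib2.build_opener(urllib2.HTTPHandler(debuglevel=1))
feeddata = opener.open('http://www.mozillazine.org/atom.xml').read()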

beer_monk
1

You probably need to use a proxy. Check your normal browser settings to find out which one. Take a look at "Opening websites using urllib2 from behind corporate firewall - 11004 getaddrinfo failed" for a similar problem with a solution.
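If your browser does use a proxy, here is a minimal sketch (assuming Python 2) of pointing urllib2 at it; 'proxy.example.com:8080' is a placeholder for whatever your browser's connection settings show:

import urllib2

proxy = urllib2.ProxyHandler({'http': 'http://proxy.example.com:8080'})
opener = urllib2.build_opener(proxy)
urllib2.install_opener(opener)  # make it the default for urlopen()
print urllib2.urlopen('http://www.google.com').read(100)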

xeor
  • I'm not behind a corporate firewall, and no proxies are defined in my browser. – quilby Feb 16 '11 at 22:38
  • Can you try 'telnet www.mozillazine.org 80', then type 'GET / HTTP/1.1' and press Enter? Does it show you any HTML output from that site? Also try to ping the magic host 'wpad', which autoproxy tries to use; the full URL for autoproxy should be 'http://wpad/wpad.dat'. – xeor Feb 19 '11 at 07:31
1

To troubleshoot the issue:

  1. Let us know what OS the script is running on and which version of Python you're using.
  2. At a command prompt on that same machine, run ping google.com and check whether it works (or whether you get something like "could not find host"); the sketch below covers steps 1 and 2 in one go.
  3. If (2) worked, open a browser on that machine (try IE if on Windows) and try opening "google.com" there. If there is a problem, look closely at the proxy settings under Internet Options / Connections / LAN Settings.

Let us know how it goes either way.
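A minimal sketch (assuming Python 2) that gathers the information from steps 1 and 2; on Linux/OS X add '-c', '4' to the ping arguments so it stops after four packets:

import platform
import subprocess
import sys

print 'OS:', platform.platform()
print 'Python:', sys.version

# Equivalent of running "ping google.com" at the command prompt (Windows sends four echoes by default)
subprocess.call(['ping', 'google.com'])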

Nas Banov
0

Add an s to the http, i.e.

urllib2.urlopen('https://google.com')

This worked for me.

tekstar