63

I have the following code to do a postback to a remote URL:

request = urllib2.Request('http://www.example.com', postBackData, { 'User-Agent' : 'My User Agent' })

try: 
    response = urllib2.urlopen(request)
except urllib2.HTTPError, e:
    checksLogger.error('HTTPError = ' + str(e.code))
except urllib2.URLError, e:
    checksLogger.error('URLError = ' + str(e.reason))
except httplib.HTTPException, e:
    checksLogger.error('HTTPException')

postBackData is a dictionary encoded with urllib.urlencode, and checksLogger is a logger set up with the logging module.

I have had a problem where this code exits while the remote server is down (this runs on customer servers, so I don't know what the exit stack trace / error is at this time). I'm assuming an exception and/or error is being raised that isn't handled above. So, are there any other exceptions that might be triggered that I'm not handling?
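For reference, the surrounding setup looks roughly like this (simplified; the dictionary contents and the logger name are placeholders):

import logging
import urllib
import urllib2   # used by the request/urlopen code above
import httplib   # needed for the httplib.HTTPException clause above

# Logger used by the snippet above (placeholder name)
logging.basicConfig(level=logging.INFO)
checksLogger = logging.getLogger('checks')

# POST data built from a dictionary (placeholder keys/values)
postBackData = urllib.urlencode({'agentKey': 'abc123', 'status': 'OK'})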

davidmytton

5 Answers

65

Add a generic exception handler:

request = urllib2.Request('http://www.example.com', postBackData, { 'User-Agent' : 'My User Agent' })

try: 
    response = urllib2.urlopen(request)
except urllib2.HTTPError, e:
    checksLogger.error('HTTPError = ' + str(e.code))
except urllib2.URLError, e:
    checksLogger.error('URLError = ' + str(e.reason))
except httplib.HTTPException, e:
    checksLogger.error('HTTPException')
except Exception:
    import traceback
    checksLogger.error('generic exception: ' + traceback.format_exc())
vartec
  • is `checksLogger.error` a user defined function in your example? – codingknob Apr 06 '13 at 21:12
  • @algotr8der: yeah, it's just a copy-and-paste of the logging from the question – vartec Apr 06 '13 at 22:54
  • socket.error (or its parent IOError) is another exception that you could usefully catch explicitly. e.g. https://stackoverflow.com/questions/20568216/python-handling-socket-error-errno-104-connection-reset-by-peer – D Read Jan 08 '16 at 17:21
20

From the urlopen entry on the docs page, it looks like you just need to catch URLError. If you really want to hedge your bets against problems within the urllib code, you can also catch Exception as a fall-back. Don't use a bare except:, since that will also catch SystemExit and KeyboardInterrupt.

Edit: What I mean to say is that you're already catching the errors it's supposed to throw. If it's throwing something else, it's probably because the urllib code isn't catching something it should have caught and wrapped in a URLError. Even the stdlib tends to miss simple things like AttributeError. Catching Exception as a fall-back (and logging what was caught) will help you figure out what's happening, without trapping SystemExit and KeyboardInterrupt.
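A minimal sketch of that pattern, reusing the names from the question (the URL and logger setup are placeholders):

import logging
import urllib2

logging.basicConfig()
checksLogger = logging.getLogger('checks')  # placeholder logger setup

try:
    response = urllib2.urlopen('http://www.example.com')
except urllib2.URLError, e:
    # HTTPError is a subclass of URLError, so this clause catches both
    checksLogger.error('URLError = ' + str(e))
except Exception:
    # fall-back for anything urllib2 failed to wrap; logs the full traceback
    checksLogger.exception('unexpected error')
    # SystemExit and KeyboardInterrupt still propagate, unlike with a bare "except:"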

Victor Schröder
DNS
15
$ grep "raise" /usr/lib64/python/urllib2.py
IOError); for HTTP errors, raises an HTTPError, which can also be
        raise AttributeError, attr
                raise ValueError, "unknown url type: %s" % self.__original
        # XXX raise an exception if no one else should try to handle
        raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
        perform the redirect.  Otherwise, raise HTTPError if no-one
            raise HTTPError(req.get_full_url(), code, msg, headers, fp)
                raise HTTPError(req.get_full_url(), code,
            raise HTTPError(req.get_full_url(), 401, "digest auth failed",
                raise ValueError("AbstractDigestAuthHandler doesn't know "
            raise URLError('no host given')
            raise URLError('no host given')
            raise URLError(err)
        raise URLError('unknown url type: %s' % type)
        raise URLError('file not on local host')
            raise IOError, ('ftp error', 'no host given')
            raise URLError(msg)
            raise IOError, ('ftp error', msg), sys.exc_info()[2]
            raise GopherError('no host given')

There is also the possibility of exceptions in urllib2 dependencies, or of exceptions caused by genuine bugs.

You are best off logging all uncaught exceptions to a file via a custom sys.excepthook. The key rule of thumb here is to never catch exceptions you aren't planning to correct, and logging alone is not a correction. So don't catch them just to log them.
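For example, a minimal excepthook along these lines would log every uncaught exception to a file (the log file path here is just an assumption):

import sys
import logging

logging.basicConfig(filename='/var/tmp/checks.log', level=logging.ERROR)  # assumed path

def log_uncaught(exc_type, exc_value, exc_traceback):
    # Let Ctrl-C fall through to the default hook instead of being logged
    if issubclass(exc_type, KeyboardInterrupt):
        sys.__excepthook__(exc_type, exc_value, exc_traceback)
        return
    logging.error('Uncaught exception', exc_info=(exc_type, exc_value, exc_traceback))

sys.excepthook = log_uncaught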

Steven Huwig
1

You can catch all exceptions and log whatever gets caught:

import sys
import traceback

def formatExceptionInfo(maxTBlevel=5):
    cla, exc, trbk = sys.exc_info()
    excName = cla.__name__
    # On newer Pythons the exception's args are not kept in __dict__,
    # so read the attribute directly
    excArgs = getattr(exc, 'args', '<no args>')
    excTb = traceback.format_tb(trbk, maxTBlevel)
    return (excName, excArgs, excTb)

try:
    x = x + 1
except:
    print formatExceptionInfo()

(Code adapted from http://www.linuxjournal.com/article/5821)

Also read the documentation on sys.exc_info.

Eugene Morozov
  • Better to use "except Exception:" so you don't catch errors that are going to cause problems in your except handler. – S.Lott Mar 20 '09 at 13:09
  • Better to not catch exceptions at all if you're just logging them -- see my answer. – Steven Huwig Mar 20 '09 at 13:15
  • @Steven Huwig: yes, but I find that even using excepthook is clumsy - I'd rather just capture stderr on the server, for example: somescript.py 2>/var/tmp/scrape.log – Eugene Morozov Mar 20 '09 at 13:22
0

I catch:

httplib.HTTPException
urllib2.HTTPError
urllib2.URLError

I believe this covers everything, including socket errors.
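In code, that is essentially the chain from the question; the order matters because HTTPError is a subclass of URLError (request and checksLogger are the names from the question):

import httplib
import urllib2

try:
    response = urllib2.urlopen(request)
except urllib2.HTTPError, e:
    # subclass of URLError, so it has to be caught before its parent
    checksLogger.error('HTTPError = ' + str(e.code))
except urllib2.URLError, e:
    # urllib2 wraps most low-level socket errors in URLError
    checksLogger.error('URLError = ' + str(e.reason))
except httplib.HTTPException, e:
    checksLogger.error('HTTPException')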

Yuval Pruss
Corey Goldberg
  • `urllib2.HTTPError` is a subclass of `urllib2.URLError`, so catching the second one is enough – pictuga Sep 15 '13 at 15:24