
Possible Duplicate:
Given a big list of urls, what is a way to check which are active/inactive?

I recently had the idea for a very simple utility script that tests whether a website is online, using two parameters: the URL being tested and the verbosity (how much packet loss can be accepted without raising an alert). I'd like to write the code in Python 2.7.3 and keep it as simple as possible.

This is my pseudocode.

def urlCheck(url, verbosity):
    badcount = 0
    iteration = 0
    while iteration < 10:
        #ping code here
        if website is down:
            badcount += 1
        iteration += 1
    if badcount > verbosity:
        print "Phooey. It appears your server is down."
    if badcount <= verbosity:
        print "Whew! Your server appears to be running smoothly."

My question is, what code should I place in the commented line? (#ping code here)

Thanks!


1 Answer


If you're stuck with the standard library, urllib2 will do the job with a HEAD request:

import urllib2

class HeadRequest(urllib2.Request):
    # Override the HTTP method so urlopen issues a HEAD instead of a GET.
    def get_method(self):
        return "HEAD"

def url_is_reachable(url):
    try:
        response = urllib2.urlopen(HeadRequest(url))
    except urllib2.URLError:
        # HTTPError is a subclass of URLError, so this also catches
        # error status codes, not just connection failures.
        return False
    return response.getcode() == 200
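
To tie this back to your pseudocode, here is a rough sketch of how that helper could slot into your loop. The time.sleep pause between checks is my own addition, not something your original plan calls for; each "ping" here is really just one HTTP request, which for a website is usually what you actually care about anyway.

import time

def urlCheck(url, verbosity):
    badcount = 0
    for iteration in range(10):
        # One "ping" = a single HEAD request to the URL.
        if not url_is_reachable(url):
            badcount += 1
        time.sleep(1)  # short pause between checks (assumed; adjust as you like)
    if badcount > verbosity:
        print "Phooey. It appears your server is down."
    else:
        print "Whew! Your server appears to be running smoothly."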

The third-party requests library makes this a lot nicer; no need for that ugly method override:

import requests

def url_is_reachable(url):
    return requests.head(url).status_code == requests.codes.ok
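
For your checking loop you'll also want outright failures (connection refused, DNS errors, timeouts) to count as "down" rather than crash the script, and a timeout keeps a hanging server from stalling a check. A rough sketch of that variant, with an arbitrary 5-second timeout of my own choosing:

def url_is_reachable(url, timeout=5):
    try:
        # timeout is in seconds; without it a single check could block for a long time
        return requests.head(url, timeout=timeout).status_code == requests.codes.ok
    except requests.exceptions.RequestException:
        # Connection errors, timeouts and the like all count as unreachable.
        return False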