
I need to detect whether a website is taking too long to respond.

For example, I need to identify this website as problematic: http://www.lowcostbet.com/

I am trying something like this:

print urllib.urlopen("http://www.lowcostbet.com/").getcode() 

but I am getting `Connection timed out`.

My objective is just to create a routine that identifies which websites are taking too long to load (e.g. more than 4 seconds) and cancels the request.

user2990084
    Is there something wrong with the timeout parameter of `urlopen()`? – MrAlexBailey Oct 23 '15 at 14:35
  • possible duplicate of: http://stackoverflow.com/questions/492519/timeout-on-a-python-function-call – Flavio Ferrara Oct 23 '15 at 14:36
  • Possible duplicate of [How to implement a timeout control for urlllib2.urlopen](http://stackoverflow.com/questions/16018007/how-to-implement-a-timeout-control-for-urlllib2-urlopen) –  Oct 23 '15 at 14:47

3 Answers


`urlopen` from the `urllib2` package has a `timeout` parameter.

You can use something like this:

from urllib2 import urlopen

TIMEOUT = 4  # seconds
website = "http://www.lowcostbet.com/"

try:
    response = urlopen(website, timeout=TIMEOUT)
except:
    mark_as_not_responsive(website)

Update:

Please note that using my snippet as-is is a bad idea, because the bare `except:` catches all kinds of exceptions, not just timeouts. You probably also want to make several attempts before marking a website as non-responsive.
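A sketch of what that might look like, assuming Python 3 (where `urllib2` became `urllib.request`); the helper name `check_responsive` and the retry count are my own:

```python
import socket
from urllib.error import URLError
from urllib.request import urlopen

def check_responsive(url, timeout=4, tries=3):
    """Return True if `url` answers within `timeout` seconds on any attempt."""
    for _ in range(tries):
        try:
            response = urlopen(url, timeout=timeout)
            response.close()
            return True
        except socket.timeout:
            continue  # took too long: retry
        except URLError as err:
            if isinstance(err.reason, socket.timeout):
                continue  # a timeout wrapped in URLError: retry
            return False  # some other network failure: give up
    return False  # every attempt timed out
```

Only timeouts trigger a retry here; other network errors (DNS failure, connection refused) fail immediately, since retrying those rarely helps.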

anti1869

Also, `requests.get` has a `timeout` kwarg you can pass in. From the docs:

requests.get('http://github.com', timeout=0.001)

This will raise an exception if the timeout elapses, so you probably want to handle that.

http://docs.python-requests.org/en/latest/user/quickstart/
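For instance, handling the timeout could look like this (a sketch; the helper name `loads_within` is my own, and it assumes the `requests` package is installed):

```python
import requests

def loads_within(url, limit=4.0):
    """Return True if `url` responds within `limit` seconds, False on timeout."""
    try:
        requests.get(url, timeout=limit)
        return True
    except requests.exceptions.Timeout:
        return False
```

Note that `requests.exceptions.Timeout` covers both connect and read timeouts, so a site that accepts the connection but is slow to answer is still flagged.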

acushner

A single timeout value is applied to both the connect and the read timeouts. Specify a tuple if you would like to set the values separately:

import requests

try:
    r = requests.get('https://github.com', timeout=(6.05, 27))
    r.raise_for_status()  # turn 4xx/5xx responses into HTTPError
except requests.Timeout:
    ...
except requests.ConnectionError:
    ...
except requests.HTTPError:
    ...
except requests.RequestException:
    ...
else:
    print(r.text)
debug