
I am making a call to a URL in Python using urllib2.urlopen inside a while(True) loop.

The URL changes on every call (one particular parameter of the URL is different each time).

My code looks as follows:

import json
from urllib2 import urlopen

def get_url(url):
    '''Get json page data using a specified API url'''
    response = urlopen(url)
    data = str(response.read().decode('utf-8'))
    page = json.loads(data)
    return page

I am calling the above method from the main function, changing the url each time I make the call.
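Roughly like this (get_next_value and the URL below are just placeholders for what my real code does):

    while True:
        value = get_next_value()                         # placeholder for the parameter that changes
        url = 'http://example.com/api?param=%s' % value  # placeholder URL
        page = get_url(url)
        # ... do something with page ...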

What I observe is that after a few calls to the function, suddenly (I don't know why) the code gets stuck at the statement

response = urlopen(url)

and it just waits and waits...

How do I best handle this situation?

I want to make sure that if it does not respond within, say, 10 seconds, I make the same call again.

I read about

response = urlopen(url, timeout=10)

but then how do I make the repeated call if this one fails?


2 Answers


Depending on how many retries you want to attempt, use a try/except inside a loop:

from urllib2 import urlopen

while True:
    try:
        response = urlopen(url, timeout=10)
        break                 # success, leave the retry loop
    except:
        # do something with the error (log it, count retries, etc.)
        pass
# do something with response
data = str(response.read().decode('utf-8'))
...

This bare except will silence all exceptions, which may not be ideal (more on that here: Handling urllib2's timeout? - Python).
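If you only want to retry on timeouts and network errors, and stop after a fixed number of attempts, something along these lines should work (max_tries here is just an arbitrary limit I picked):

    import socket
    import urllib2
    from urllib2 import urlopen

    max_tries = 5                                 # arbitrary retry limit
    response = None
    for attempt in range(max_tries):
        try:
            response = urlopen(url, timeout=10)
            break                                 # success, stop retrying
        except (urllib2.URLError, socket.timeout):
            pass                                  # timed out or network error, try again
    if response is None:
        raise RuntimeError('no response after %d attempts' % max_tries)
    data = str(response.read().decode('utf-8'))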


With this approach you can retry once:

import json
from urllib2 import urlopen

def get_url(url, trial=1):
    '''Get json page data using a specified API url'''
    try:
        response = urlopen(url, timeout=10)
        data = str(response.read().decode('utf-8'))
        page = json.loads(data)
        return page
    except:
        if trial == 1:
            # first failure: retry once
            return get_url(url, trial=2)
        else:
            # second failure: give up
            return None
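If one retry turns out not to be enough, the same recursive idea can be extended with a maximum number of trials, something like this (max_trials is just an illustrative addition, not part of the original answer):

    import json
    from urllib2 import urlopen

    def get_url(url, trial=1, max_trials=3):
        '''Get json page data, retrying up to max_trials times on failure.'''
        try:
            response = urlopen(url, timeout=10)
            data = str(response.read().decode('utf-8'))
            return json.loads(data)
        except:
            # same bare except as above; catches any failure and retries
            if trial < max_trials:
                return get_url(url, trial + 1, max_trials)
            return None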