I have a Python scraping script.
At many points, the code has to interact with websites.
Sometimes the code crashes because my wifi signal drops.
I have been told that I should wrap the problem parts of the code in a block like:

    try:
        ...
    except urllib2.URLError as e:
        ...

But the code has many lines that interact with websites, with other things in between.
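Concretely, the per-call wrapping I'd end up with looks something like this (fetch_page and the URL are just placeholders standing in for my real calls, and here it only simulates the connection dropping):

```python
# Compatibility shim: urllib2 exists on Python 2; Python 3 moved
# URLError to urllib.error.
try:
    from urllib2 import URLError  # Python 2
except ImportError:
    from urllib.error import URLError  # Python 3

def fetch_page(url):
    # Placeholder for a real urllib2.urlopen(url).read() call;
    # here it simulates the wifi dropping mid-scrape.
    raise URLError("simulated wifi drop")

try:
    html = fetch_page("http://example.com/page1")
except URLError as e:
    html = None  # handle the dropped connection somehow
```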
My question is: do I really need to put this block around every single line of code that interacts with a website? Or is there something fancier I could do that would tell the whole script, "if you have a problem connecting to the internet at any point, just wait 10 seconds and try that line again"?
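For reference, the kind of blanket retry behavior I'm imagining would look roughly like this wrapper (the function name, retry count, and 10-second delay are just my guesses at how such a thing might work, not code I know to be the right approach):

```python
import time

# Compatibility shim: urllib2 on Python 2, urllib.error on Python 3.
try:
    from urllib2 import URLError  # Python 2
except ImportError:
    from urllib.error import URLError  # Python 3

def retry_on_urlerror(func, retries=5, delay=10):
    # Return a version of func that, on URLError, waits `delay`
    # seconds and tries again, up to `retries` attempts total,
    # then re-raises the last error.
    def wrapper(*args, **kwargs):
        for attempt in range(retries):
            try:
                return func(*args, **kwargs)
            except URLError:
                if attempt == retries - 1:
                    raise  # out of attempts, give up
                time.sleep(delay)
    return wrapper
```

Then any website-touching function could be wrapped once, e.g. `fetch = retry_on_urlerror(fetch)`, instead of sprinkling try/except around every call site.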