I've come across a problematic page that causes Selenium Chrome (Selenium 3.10.0 on Python 3, chromedriver 2.35.528157) on macOS to time out, I think because something on the page loads indefinitely. The problem is that after that timeout, all subsequent calls to .get() a new URL also fail with a timeout, even for URLs that worked before. In fact, watching the browser, it is never sent to the new URL at all. This, of course, renders the browser useless for anything further.
How can I "reset" the driver so that I can carry on using it? Or failing that, how can I debug why the .get() command doesn't seem to work after visiting the problematic page. The code and my output are below (problematic page is http://coastalpathogens.wordpress.com/2012/11/25/onezoom/
: I'd be interested if other people see the same thing, and with other pages too
from selenium import webdriver
from selenium.common.exceptions import TimeoutException

browser = webdriver.Chrome()
browser.set_page_load_timeout(10)  # give up on a page load after 10 seconds
browser.implicitly_wait(1)

for link in ("http://www.google.com",
             "http://coastalpathogens.wordpress.com/2012/11/25/onezoom/",
             "http://www.google.com"):
    try:
        print("getting {}".format(link))
        browser.get(link)
        print("done!")
    except TimeoutException:
        print("Timed out")
        continue
Result:
getting http://www.google.com
done!
getting http://coastalpathogens.wordpress.com/2012/11/25/onezoom/
Timed out
getting http://www.google.com
Timed out
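For reference, the kind of "reset" I'm asking about would look roughly like the sketch below. These are just guesses at possible workarounds, not things I know to work: sending window.stop() via execute_script to abort the hung load, and, as a last resort, throwing the wedged driver away and starting a fresh one.

from selenium import webdriver
from selenium.common.exceptions import TimeoutException, WebDriverException

def get_with_recovery(browser, url, timeout=10):
    # Try to load the URL; on timeout, attempt to make the session usable again.
    try:
        browser.get(url)
        return browser
    except TimeoutException:
        try:
            # Guess 1: ask the page to stop loading whatever is hanging
            browser.execute_script("window.stop();")
        except WebDriverException:
            pass
        # Guess 2: give up on this driver and replace it with a fresh one
        browser.quit()
        new_browser = webdriver.Chrome()
        new_browser.set_page_load_timeout(timeout)
        return new_browser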
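On the debugging side, the only thing I can think of is turning on chromedriver's own verbose log and inspecting what happens to the session after the timeout. A minimal sketch (the log path is just an example location):

from selenium import webdriver

# Start chromedriver with verbose logging written to a file
browser = webdriver.Chrome(
    service_args=["--verbose", "--log-path=/tmp/chromedriver.log"]
)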