I'm testing a site through lots of proxies, and the problem is that some of those proxies are awfully slow, so my code gets stuck loading pages every now and then.
from selenium import webdriver

browser = webdriver.Firefox()
# get() blocks until the page has finished loading; with a slow proxy it can hang here
browser.get("http://example.com/example-page.php")
element = browser.find_element_by_id("someElement")
I've tried lots of stuff like explicit waits and implicit waits, and I've been searching around for quite a while, but I still haven't found a solution or workaround. Nothing seems to affect the page-loading line browser.get("http://example.com/example-page.php"), and that's why it's always stuck there.
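For reference, here is roughly what those attempts looked like. A minimal sketch; the 10-second timeout and the element ID are just placeholders:

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

browser = webdriver.Firefox()
browser.get("http://example.com/example-page.php")  # still blocks right here

# Explicit wait: poll up to 10 seconds for the element to appear
element = WebDriverWait(browser, 10).until(
    EC.presence_of_element_located((By.ID, "someElement"))
)

# Implicit wait: every find_element_* call polls up to 10 seconds
browser.implicitly_wait(10)
element = browser.find_element_by_id("someElement")

Neither helps, because the script never gets past get() while the page is loading.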
Anybody got a solution for this?
Update 1:
JimEvans' answer solved my previous problem, and here you can find the Python patch for this new feature.
New problem:
from selenium import webdriver

browser = webdriver.Firefox()
browser.set_page_load_timeout(30)
browser.get("http://example.com/example-page.php")
element = browser.find_element_by_id("elementA")
element.click()  # assume it's a link to a new page, http://example.com/another-example.php
another_element = browser.find_element_by_id("another_element")
As you can see, browser.set_page_load_timeout(30) only affects browser.get("http://example.com/example-page.php"): if that page takes more than 30 seconds to load, it throws a timeout exception. The problem is that the timeout has no power over page loads triggered by element.click() (although, to be fair, click() does not block until the new page has entirely loaded).
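At least the timeout on get() is catchable, so the script can regain control after 30 seconds. A minimal sketch, assuming the browser is set up as above (the except handler is just one way to deal with it):

from selenium.common.exceptions import TimeoutException

browser.set_page_load_timeout(30)
try:
    browser.get("http://example.com/example-page.php")
except TimeoutException:
    # get() gave up after 30 seconds; the page may still be loading,
    # but the script regains control here
    pass
element = browser.find_element_by_id("elementA")
element.click()  # no comparable timeout applies to the load this triggers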
The new pain in the ass is another_element = browser.find_element_by_id("another_element"), because both explicit waits and implicit waits wait for the whole page to load up before they start to look for that element. In some extreme cases this can take HOURS. What can I do about it?