
I have written several Python scripts that use Selenium to generate a web page containing a data table of almost 100,000 rows and 25 columns. One example of this code can be found here. Afterwards, I intend to scrape that table with Beautiful Soup. However, I keep getting this error message:

selenium.common.exceptions.TimeoutException: Message: timeout: Timed out receiving message from renderer: -0.001
  (Session info: chrome=81.0.4044.113)

Is there a reason why this message keeps popping up? Is there a way to fix the code so that the page finishes loading and I can scrape it?
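For context, the hand-off I am aiming for looks roughly like this (a simplified sketch rather than my exact script; the URL, the table locator, and the 120-second wait are placeholders):

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from bs4 import BeautifulSoup

driver = webdriver.Chrome()
driver.get("http://example.com/report")  # placeholder URL for the generated page

# wait for the generated table before reading the HTML
WebDriverWait(driver, 120).until(EC.presence_of_element_located((By.TAG_NAME, "table")))

# hand the rendered HTML to Beautiful Soup for the actual scraping
soup = BeautifulSoup(driver.page_source, "html.parser")
rows = soup.find("table").find_all("tr")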

Any assistance is truly appreciated.


I have the following code to move from one web page to another, but I still get the exact same timeout error.

from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
driver.switch_to.window(driver.window_handles[1])  # focus the newly opened tab
WebDriverWait(driver, 60).until(EC.visibility_of_element_located((By.XPATH, '/html/body/p[2]/text()[2]')))
df_url = driver.current_url  # URL of the page that holds the table

What am I missing here that will stop it from timing out?
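One thing I have been wondering about (untested, and the pageLoadStrategy capability here is an assumption on my part) is whether telling Chrome not to block on the full page load, and then relying only on the explicit wait, would avoid the renderer timeout:

from selenium import webdriver
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities

caps = DesiredCapabilities.CHROME.copy()
caps["pageLoadStrategy"] = "none"  # return from driver.get() without waiting for the full render
driver = webdriver.Chrome(desired_capabilities=caps)
# ...then rely on the WebDriverWait above to decide when the table is actually ready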

    You can use a `WebDriverWait` so that Selenium waits until the page has loaded within the given period of time; I suggest googling it. Also use `--headless` Chrome for faster page loading, and use `BeautifulSoup` together with Selenium for faster scraping. – Abhay Salvi Apr 18 '20 at 01:58
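A minimal sketch of the comment's suggestions, assuming Chrome; the flags and the 300-second page-load timeout are illustrative values, not something tested against this particular page:

from selenium import webdriver

options = webdriver.ChromeOptions()
options.add_argument("--headless")      # render without a visible browser window
options.add_argument("--disable-gpu")   # commonly paired with --headless on some platforms
driver = webdriver.Chrome(options=options)
driver.set_page_load_timeout(300)       # allow extra time for the ~100,000-row table to render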

0 Answers