
I have just started with Selenium. I have collected all the links that I want to click. Each link leads to a page that I want to scrape. When the driver clicks on the first link it works perfectly fine, but on the second link click I get this message:

Message: stale element reference: element is not attached to the page document

def view_All_pages(driver , file_object):
    #All the Links that i want to click
    list_of_links = driver.find_elements_by_xpath("//a[@class='@ClassName']")
    for link in list_of_links:
        time.sleep(10)
        link.click()  #Getting the error here
        scraping_Normal_page(driver , file_object)
        driver.back()

I want to scrape the page, come back to the opening page, and then click the next link. Help will be appreciated.

Dhruvit
1 Answer

A StaleElementReferenceException is thrown when a reference to an already found and saved element is no longer valid. In your case you navigate to a new page, but your list contains saved elements that belong to the parent page. When you navigate back, the elements in the list become stale.

This can be solved by re-locating the elements after navigating back from the scraped page. To do that, change the iteration to go over indices instead of over the saved elements themselves.

def view_All_pages(driver, file_object):
    # All the links that you want to click
    list_of_links = driver.find_elements_by_xpath("//a[@class='@ClassName']")
    # Iterate over the list by index
    i = 0
    while i < len(list_of_links):
        time.sleep(10)
        list_of_links[i].click()
        scraping_Normal_page(driver, file_object)
        driver.back()
        i += 1
        # Re-locate the elements after navigating back,
        # since the old references are now stale
        list_of_links = driver.find_elements_by_xpath("//a[@class='@ClassName']")
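An alternative sketch that avoids stale references entirely: collect the `href` strings up front (plain strings cannot go stale) and navigate to each one with `driver.get()` instead of clicking and going back. This assumes the links have real `href` attributes and reuses the hypothetical `scraping_Normal_page` helper from the question:

```python
def collect_hrefs(elements):
    """Extract the non-empty href attributes from already-located link elements."""
    hrefs = []
    for el in elements:
        href = el.get_attribute("href")
        if href:
            hrefs.append(href)
    return hrefs

def view_all_pages(driver, file_object):
    links = driver.find_elements_by_xpath("//a[@class='@ClassName']")
    # Navigate directly to each saved URL; no element references are reused,
    # so nothing can become stale.
    for url in collect_hrefs(links):
        driver.get(url)
        scraping_Normal_page(driver, file_object)
```

This only works when clicking the link does a plain navigation (not a JavaScript handler), but when it applies it is simpler and faster than re-locating the element list on every iteration.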
StrikerVillain