
I'm stuck scraping an Angular site with Python Selenium: it throws selenium.common.exceptions.NoSuchElementException when I click on any link. The Chrome driver shows the result has loaded, but when I try to find any element within that result, Selenium raises the exception.

driver.get("https://recruiting.adp.com/srccar/public/RTI.home?c=1153651&d=ExternalCareerSite&rb=ConstellisSite#/")
job_elements = WebDriverWait(driver, 20).until(EC.visibility_of_all_elements_located(
                (By.XPATH, "//a[@class='jobtitle job-title-link']")))
job = job_elements[0]
job.click()
# after some wait, once the elements are visible in the driver
driver.find_element_by_xpath("//span[@class='jobTitle job-detail-title']").text

Then the exception is raised. How can I extract the elements that appear after the click on an Angular site? Help will be appreciated.

Rana. Amir

1 Answer


You were close. Just as you induced WebDriverWait with visibility_of_all_elements_located() before collecting the desired elements, to extract the text K-9 Handler (Boston/Part Time) after the click you have to induce WebDriverWait with visibility_of_element_located(). You can use either of the following locator strategies:

  • Using XPATH and text attribute:

    driver.get('https://recruiting.adp.com/srccar/public/RTI.home?c=1153651&d=ExternalCareerSite&rb=ConstellisSite#/')
    job_elements = WebDriverWait(driver, 20).until(EC.visibility_of_all_elements_located((By.XPATH, "//a[@class='jobtitle job-title-link']")))
    job_elements[0].click()
    print(WebDriverWait(driver, 20).until(EC.visibility_of_element_located((By.XPATH, "//span[@class='jobTitle job-detail-title']"))).text)
    
  • Using CSS_SELECTOR and get_attribute() method:

    driver.get('https://recruiting.adp.com/srccar/public/RTI.home?c=1153651&d=ExternalCareerSite&rb=ConstellisSite#/')
    job_elements = WebDriverWait(driver, 20).until(EC.visibility_of_all_elements_located((By.XPATH, "//a[@class='jobtitle job-title-link']")))
    job_elements[0].click()
    print(WebDriverWait(driver, 20).until(EC.visibility_of_element_located((By.CSS_SELECTOR, "span.jobTitle.job-detail-title"))).get_attribute("innerHTML"))
    
  • Console Output:

    K-9 Handler (Boston/Part Time)
    
  • Note: You have to add the following imports:

    from selenium.webdriver.support.ui import WebDriverWait
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support import expected_conditions as EC
    
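As a side note on the two locator strategies above: the XPath `//a[@class='jobtitle job-title-link']` matches the `class` attribute as one verbatim string, so it breaks if the classes are reordered or another class is added, whereas the CSS selector `a.jobtitle.job-title-link` matches each class independently. A minimal sketch of the XPath pitfall, using the standard-library ElementTree (which supports `[@attr='value']` predicates) on hypothetical markup rather than the live ADP page:

```python
import xml.etree.ElementTree as ET

# Two anchors carrying the same two classes, in different orders.
html = (
    '<div>'
    '<a class="jobtitle job-title-link">A</a>'
    '<a class="job-title-link jobtitle">B</a>'
    '</div>'
)
root = ET.fromstring(html)

# The exact-string predicate only matches the first anchor,
# because the attribute value must equal the string verbatim.
matches = root.findall(".//a[@class='jobtitle job-title-link']")
print([a.text for a in matches])  # → ['A']
```

A CSS selector such as `a.jobtitle.job-title-link` (as used in the second option above) would match both anchors, which is why it tends to be the more robust choice when the site's class order isn't guaranteed.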

References

You can find a couple of relevant discussions on NoSuchElementException in:

undetected Selenium
  • this works only in the Python console but not when run as a spider. :( – Rana. Amir Jul 05 '20 at 14:23
  • @Rana.Amir What do you mean by `spider`? I don't see anything mentioned in your question regarding `spider`. Did you run the code in my answer? Can you let me know the status? – undetected Selenium Jul 05 '20 at 21:30
  • I meant that when I run this as a Python Scrapy spider with the command `scrapy crawl spider_name`, it doesn't give me any element after I click on a link. – Rana. Amir Jul 07 '20 at 09:30