I'm a newbie to Python, and recently I got interested in web crawling.
Today I got stuck on a NoSuchElementException.
This is the webpage that I want to scrape.
When I click the username that I erased, it returns a box like this.
Even though I used the XPath that I copied from the Chrome developer tools, it raises a NoSuchElementException:
Message: no such element: Unable to locate element: {"method":"xpath","selector":"//*[@id="main-area"]/div[4]/table/tbody/tr[1]/td[2]/div/table/tbody/tr/td/a"}
(Session info: chrome=87.0.4280.88)
The HTML is like this:
<a href="#" class="m-tcol-c" onclick="ui(event, 'royaltina',3,'이주연마인','25868806','me', 'false', 'true', 'schoolch', 'false', '5'); return false;">이주연마인</a>
My code is just this:
driver.find_element_by_xpath("//*[@id=\"main-area\"]/div[4]/table/tbody/tr[1]/td[2]/div/table/tbody/tr/td/a")
I checked that this XPath exists on the page, but when I pass it to the .find_element_by_xpath() method it raises the error above.
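To rule out a typo in the locator itself, I also tested a simpler relative XPath against just the anchor's HTML using lxml (lxml and the wrapping <div> here are only my own illustration, not the failing Selenium code):

```python
from lxml import html

# The anchor copied from the page, wrapped in a container so it can be queried.
snippet = "<div><a href='#' class='m-tcol-c'>이주연마인</a></div>"
tree = html.fromstring(snippet)

# A relative locator by class matches the link just fine in isolation.
links = tree.xpath("//a[@class='m-tcol-c']")
print(links[0].text)  # 이주연마인
```

So the XPath syntax seems fine on its own; the element just cannot be found in the live page.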
I would really like to share the webpage, but it requires logging in to get there, so I cannot share it.
Can you guess what might cause this problem?
I checked that timing is not the problem, and that an iframe is not the problem.
Thank you in advance. Have a great day!