I'm trying to use Selenium & Python to scrape a website (http://epl.squawka.com/english-premier-league/06-03-2017/west-ham-vs-chelsea/matches). I use the webdriver to click a heading, wait for the new information to load, then click an object before scraping the resulting data (which loads after the click). My problem is that I keep getting an 'Unable to locate element' error.
I've taken a screenshot at this point and can physically see the element, and I've also printed the entire page source and can see that the element is there.
import time
from bs4 import BeautifulSoup
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver.find_element_by_id("mc-stat-shot").click()
time.sleep(3)
driver.save_screenshot('test.png')
try:
    element = WebDriverWait(driver, 10).until(
        EC.presence_of_element_located((By.ID, "svg")))
finally:
    driver.find_element_by_xpath("//g[3]/circle").click()
    time.sleep(1)
    goalSource = driver.page_source
    goalBsObj = BeautifulSoup(goalSource, "html.parser")
    #print(goalBsObj)
    print(goalBsObj.find(id="tt-mins").get_text())
    print(goalBsObj.find(id="tt-event").get_text())
    print(goalBsObj.find(id="tt-playerA").get_text())
and the result is an error: "selenium.common.exceptions.NoSuchElementException: Message: Unable to locate element: //g[3]/circle"
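One thing I suspect (this is an assumption on my part, not something I've confirmed on the live page): the circle sits inside an inline <svg>, which puts it and its <g> ancestors in the SVG namespace, and a plain XPath like //g[3]/circle only matches un-namespaced elements. A minimal standalone sketch of that behaviour, using the standard library's ElementTree on a made-up snippet:

```python
# Assumption: the shot map is an inline <svg>, so its children carry the
# SVG namespace and an un-namespaced XPath like //g[3]/circle finds nothing.
import xml.etree.ElementTree as ET

svg_snippet = (
    '<div><svg xmlns="http://www.w3.org/2000/svg">'
    '<g/><g/><g><circle r="1"/></g></svg></div>'
)
root = ET.fromstring(svg_snippet)

# An un-namespaced search finds no <g> elements...
print(len(root.findall('.//g')))  # 0
# ...because every element's tag carries the SVG namespace:
print(len(root.findall('.//{http://www.w3.org/2000/svg}g')))  # 3
```

If that's the cause, the commonly cited Selenium-side workaround is a namespace-agnostic XPath such as driver.find_element_by_xpath("//*[name()='g'][3]/*[name()='circle']"), which matches elements by local name regardless of namespace.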