I am new to web crawling. A while ago I wrote the following simple crawler, and it worked well. Recently I came back to the code and tried to do more with the crawler, but browser.find_element_by_id("lst-ib") no longer works and I receive an error that says
' no such element: Unable to locate element: {"method":"css selector","selector":"[id="lst-ib"]"} (Session info: chrome=84.0.4147.89) '
To work around the problem, I copied the XPath of the Google search input box from the browser's Inspect tool and used that instead. Is it always like this? Do the ids and CSS selectors we rely on in a crawler change regularly, so that we have to keep updating the code?
from selenium import webdriver
url = "https://www.google.com"
browser = webdriver.Chrome(executable_path = "chromedriver")
browser.get(url)
# The original id-based lookup that no longer works:
#inputElement = browser.find_element_by_id("lst-ib")
# I replaced the previous id lookup with the XPath copied from Inspect
inputElement = browser.find_element_by_xpath("/html/body/div/div[2]/form/div[2]/div[1]/div[1]/div/div[2]/input")
inputElement.send_keys("my input search text")
inputElement.submit()
browser.quit()
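
While debugging I also tried a variant that locates the search box by its name attribute instead of the generated id or a full XPath. This is only a sketch of what I experimented with, and it assumes the input element still carries name="q"; I am not sure whether relying on that attribute is the right long-term approach either.

from selenium import webdriver

url = "https://www.google.com"
browser = webdriver.Chrome(executable_path="chromedriver")
browser.get(url)

# Locate the search box by its name attribute rather than the id or a full XPath.
# Assumption: the input element still has name="q".
inputElement = browser.find_element_by_name("q")
inputElement.send_keys("my input search text")
inputElement.submit()
browser.quit()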