I'm trying to do dynamic web scraping of a JavaScript-rendered webpage using Python and Selenium.
1) The problem is that the elements only load when I scroll down the page slowly.
I have tried:
driver.execute_script("window.scrollTo(0, Y)")
(this doesn't work because it only scrolls to one fixed point on the page, so results further down never load)
and
driver.execute_script("window.scrollTo(0, document.body.scrollHeight);")
(this doesn't work either, because the elements don't load when I jump straight to the bottom of the page; the page seems to require scrolling through it slowly)
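To illustrate what I mean by scrolling slowly, here is a rough sketch of the kind of incremental-scroll loop I have in mind (the 500 px step and the 1-second pause are arbitrary placeholders, not values I know to work):

import time

# Sketch only: scroll down in small steps, pausing so lazy-loaded results
# have time to appear, and stop once the page height stops growing.
last_height = driver.execute_script("return document.body.scrollHeight")
position = 0
while True:
    position += 500  # step size is a guess
    driver.execute_script("window.scrollTo(0, arguments[0]);", position)
    time.sleep(1)  # pause length is a guess
    new_height = driver.execute_script("return document.body.scrollHeight")
    if position >= new_height and new_height == last_height:
        break  # reached the bottom and nothing new loaded
    last_height = new_height

Even with a loop like this, I don't know how to reliably tell when everything has finished loading, which is what my second question is about.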
2) How do I make Selenium wait for all my elements to be loaded before returning them to me?
I understand that this solution exists:
myElem = WebDriverWait(browser, delay).until(EC.presence_of_element_located((By.ID, 'IdOfMyElement')))
But how would this work if results keep appearing as the user scrolls down the page? Won't this code make Selenium stop waiting once it detects the first occurrence of that element?
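For reference, this is how I understand that explicit-wait pattern once the imports are included (delay and 'IdOfMyElement' are placeholders, and I'm assuming browser is the same webdriver instance I call driver above):

from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

delay = 10  # placeholder timeout in seconds

# As I understand it, this blocks only until the first element with this ID
# is present in the DOM, then returns that element (or raises
# TimeoutException after `delay` seconds).
myElem = WebDriverWait(driver, delay).until(
    EC.presence_of_element_located((By.ID, 'IdOfMyElement'))
)

If that's right, it returns as soon as the first match exists, which is exactly why I don't see how it helps when results keep being added as the page scrolls.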