
I am trying to load the page with Selenium and parse it with Beautiful Soup. I have tried different ways to simulate the click on the "load more" button, but only the code below works (see "How to scroll down in Python Selenium step by step"):

# find every element with this data-teach-id, scroll it into view and click it
read_mores = driver.find_elements_by_xpath('//*[@data-teach-id="' + tid + '"]')
for read_more in read_mores:
    driver.execute_script("arguments[0].scrollIntoView();", read_more)
    driver.execute_script("$(arguments[0]).click();", read_more)  # relies on $ (typically jQuery) being defined on the page
soup = BeautifulSoup(driver.page_source, 'html.parser')

However, it sometimes fails to load the whole page. I know a "click and wait" approach would probably work, but I have no idea where to put it in the code. I would also love to know if there is another way to deal with this. Any help would be greatly appreciated!
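
My guess is that the wait belongs right after each click, roughly like this untested sketch (the '.expanded-content' locator is just a placeholder, since I don't know what element to wait for):

from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

for read_more in read_mores:
    driver.execute_script("arguments[0].scrollIntoView();", read_more)
    driver.execute_script("$(arguments[0]).click();", read_more)
    # wait up to 10 seconds for the newly loaded content to appear;
    # '.expanded-content' is only a placeholder selector
    WebDriverWait(driver, 10).until(
        EC.presence_of_element_located((By.CSS_SELECTOR, '.expanded-content')))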

Chloe

1 Answer


I recommend doing:

driver.execute_script("document.querySelector('[data-teach-id=\"" + tid + "\"]').click()")

Rather than selecting the element in Selenium and passing it to the browser context, do it all in the browser context. Less possibility for things to go horribly wrong that way.
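
If, as in your snippet, there can be more than one matching element, the same idea extends to a single call that clicks them all. A rough sketch, reusing the data-teach-id selector from above:

# scroll to and click every matching element, entirely inside the browser
driver.execute_script(
    "document.querySelectorAll('[data-teach-id=\"" + tid + "\"]')"
    ".forEach(function (el) { el.scrollIntoView(); el.click(); });"
)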

Just to get fancy...

Let's abstract the click to a function (I think I'll do this from now on):

def click(css):
  global driver
  driver.execute_script("document.querySelector('" + css + "').click()")

Now we can do:

click('a[data-teach-id="' + tid + '"]')

Ah, much less painful.
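
And if the button only loads part of the content each time (your "fails to load the whole page sometimes"), you could loop on top of that helper until the element disappears. A rough, untested sketch; load_everything, the pause and the click limit are just names and values I made up:

import time

def load_everything(css, pause=1.0, max_clicks=50):
    # keep clicking until the element is gone from the DOM (or we give up)
    for _ in range(max_clicks):
        still_there = driver.execute_script(
            "return document.querySelector('" + css + "') !== null")
        if not still_there:
            break
        click(css)
        time.sleep(pause)  # crude wait; a WebDriverWait on the new content would be nicer

load_everything('a[data-teach-id="' + tid + '"]')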

pguardiario
  • @Chloe Is it only loading some of the content, pushing the page lower so you have to click again? If so, run that in a loop: if the element is visible while you can still click it, use that as your loop condition. – Kamikaze_goldfish Nov 21 '18 at 21:40
  • Sorry, without a URL I can't see what's going on there. – pguardiario Nov 21 '18 at 23:59