
I have a list of domains that I would like to loop over and screenshot using Selenium. However, the cookie consent banner means the full page is not viewable. Most of the sites have different consent buttons - what is the best way of accepting these? Or is there another method that could achieve the same result?

urls for reference: docjournals.com, elcomercio.com, maxim.com, wattpad.com, history10.com


R.Sav

3 Answers


You'll need to click the accept button individually for every website. You can do that using:

from selenium.webdriver.common.by import By

# Locate the consent button with a site-specific XPath and click it
driver.find_element(By.XPATH, "your_XPATH_locator").click()
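
Because consent banners are usually injected after the initial page load, an explicit wait is more reliable than an immediate lookup. Below is a minimal sketch; the URL and the XPath are placeholders (many banners use an "Accept" label, but you should inspect each site for its real locator):

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
driver.get("https://www.wattpad.com")

# Wait up to 10 seconds for the consent button to become clickable,
# since the banner is often rendered after the DOM is ready.
# The XPath below is an assumption - replace it per site.
WebDriverWait(driver, 10).until(
    EC.element_to_be_clickable((By.XPATH, "//button[contains(., 'Accept')]"))
).click()

driver.save_screenshot("wattpad.png")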
kaliiiiiiiii

As you can observe from the snapshot in the question, different urls have different consent buttons. They may vary with respect to the button text, the element's tag and attributes, and where the banner sits in the page (some sites render it inside an iframe).


Conclusion

There can't be a generic solution to accept/deny the cookie consent, as the button text, the locators, and the surrounding markup differ from site to site and may change over time, so each page effectively needs its own handling.
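
That said, a best-effort heuristic can dismiss the banner on many sites by trying a handful of common button labels before falling back to a per-site locator. A sketch under that assumption - the label list and the helper name are purely illustrative, and it only handles banners in the main document (not in an iframe):

from selenium.common.exceptions import NoSuchElementException
from selenium.webdriver.common.by import By

# Labels commonly seen on consent buttons; illustrative, not exhaustive
COMMON_LABELS = ["Accept", "Accept all", "I agree", "Agree", "OK", "Got it"]

def try_accept_cookies(driver):
    """Best-effort attempt to dismiss a consent banner. Returns True on success."""
    for label in COMMON_LABELS:
        try:
            driver.find_element(
                By.XPATH, f"//button[normalize-space()='{label}']"
            ).click()
            return True
        except NoSuchElementException:
            continue
    return False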

undetected Selenium

To get around the XPath selectors varying from page to page, you can check driver.current_url and use the url to decide which selector you need. Alternatively, if you iterate over the pages anyway, you can pair each url with its selector up front:

from selenium import webdriver
from selenium.webdriver.common.by import By

page_1 = {
    'url': 'https://docjournals.com',
    'selector': 'example_selector_1'
}

page_2 = {
    'url': 'https://elcomercio.com',
    'selector': 'example_selector_2'
}

pages = [page_1, page_2]

driver = webdriver.Chrome()
for page in pages:
    driver.get(page['url'])
    # Click the consent button using the page-specific locator
    driver.find_element(By.XPATH, page['selector']).click()
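
Since the end goal is a screenshot per domain, the loop can capture one after dismissing the banner. A possible extension - the try/except guards against pages where the banner is absent, and the file naming is just an illustration:

import time
from urllib.parse import urlparse

from selenium.common.exceptions import NoSuchElementException

for page in pages:
    driver.get(page['url'])
    try:
        # Dismiss the banner if it is present
        driver.find_element(By.XPATH, page['selector']).click()
        time.sleep(1)  # give the banner a moment to disappear
    except NoSuchElementException:
        pass  # no banner this visit - take the screenshot anyway
    filename = urlparse(page['url']).netloc.replace('.', '_') + '.png'
    driver.save_screenshot(filename)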

Romek