
Scraping webpages with Python 3.8, Selenium and BeautifulSoup, I would like to remove or alter some elements. Since not all pages contain the respective elements, soup.find() returns None for those pages and the chained call raises an exception, which I have to catch:

try:
    soup.find('aside', id="post").decompose()
except Exception:
    pass
try:
    soup.find('footer', id="footer").decompose()
except Exception:
    pass
try:
    soup.find(class_="myclass")["class"] = ''
except Exception:
    pass

There is a lot of repetition in this code (my actual list of statements is even longer), so I tried to combine everything into a single block:

try:
    soup.find('aside', id="post").decompose()
    soup.find('footer', id="footer").decompose()
    soup.find(class_="myclass")["class"] = ''
except Exception:
    pass

But this isn't what I want to achieve, because if the first statement doesn't find a match, the following statements aren't evaluated at all. What's a good, pythonic and elegant way to execute/evaluate all statements? I have also read that using pass is bad practice. Maybe try isn't the right tool here at all, and I would be better off with something like PHP's isset() (but I don't know the Python equivalent)?
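In other words, what I want to avoid spelling out for every single statement is an explicit None test (the Python counterpart of that isset() check), sketched here for one of them:

    tag = soup.find('aside', id="post")
    if tag is not None:  # find() returns None when nothing matches
        tag.decompose()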

Madamadam

1 Answer


Not an ideal solution, but you can decorate functions to ignore exceptions and then use the decorated functions instead of the originals:

from functools import wraps

def exceptions_ignored(callee):
    @wraps(callee)
    def _ignore(*args, **kwargs):
        try:
            return callee(*args, **kwargs)
        except Exception:
            # Swallow any exception; the wrapper then returns None.
            pass
    return _ignore

mydivmod = exceptions_ignored(divmod)
# or define it as 
# @exceptions_ignored
# def mydivmod(n, d):
#     return divmod(n, d)

mydivmod(5, 0)  # ZeroDivisionError is suppressed; the call returns None
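
Applied to the scraping code from the question, you could decorate a small helper (remove_tag is just an illustrative name) so that each removal is attempted independently:

    @exceptions_ignored
    def remove_tag(soup, *args, **kwargs):
        # find() returning None makes decompose() raise; the decorator swallows it
        soup.find(*args, **kwargs).decompose()

    remove_tag(soup, 'aside', id="post")
    remove_tag(soup, 'footer', id="footer")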
khachik