
I've written a web scraper using the Python `requests` library which mimics the browser's XHR requests to the target website. However, `requests` generates an `InsecureRequestWarning` when connecting to the site. I have set `verify=False` so that I can at least connect, but every request then generates this warning. Is there a way to suppress it? (I'm also not clear why there are SSL errors at all, as the domain's certificate looks valid when checked via an SSL checker - does `requests` report what the SSL issue is somehow?)

InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.org/en/latest/security.html
  InsecureRequestWarning)
Yunti
    How is this a duplicate of the [linked question](http://stackoverflow.com/questions/10667960/python-requests-throwing-up-sslerror)? This question is about suppressing the warning for unverified requests, the other question is about a specific SSL error - which might be a totally different one than the one that motivated OP to use `verify=False`. Voting to reopen. – Lukas Graf Nov 04 '15 at 19:08
  • How to suppress those warnings is actually documented in the URL that's given in the warning: https://urllib3.readthedocs.org/en/latest/security.html#disabling-warnings - however, you should first try to use the `security` extra (e.g. `pip install requests[security]`), you shouldn't be needing to use `verify=False`. – Lukas Graf Nov 04 '15 at 19:12
  • Thanks Lukas - Yes, I didn't find the linked question particularly useful, so I'm not sure why it was marked as a duplicate. Bizarrely, when I try `pip install requests[security]` I get "no matches found"? – Yunti Nov 05 '15 at 17:41
  • Got it - it needed quotes. Unfortunately that didn't solve the issue. Looks like I will just have to suppress the warnings. – Yunti Nov 05 '15 at 17:49

0 Answers