
I am trying to execute the code below (reading content from an HTML page) using FancyURLopener. The code was working fine for the last two months or so, but it has now started to throw this error: IOError: [Errno socket error] [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)

When I run it locally, it works like a charm.

    from urllib import urlopen
    from urllib import FancyURLopener
    from bs4 import BeautifulSoup
    import requests

    doc_name = "XYZ"

    class MyOpener(FancyURLopener):
        version = 'Mozilla/5.0 (Windows; U; Windows NT 5.1; it; rv:1.8.1.11) Gecko/20071127 Firefox/2.0.0.11'

    mopen = MyOpener()

    def extract_count_from_url(url, tag_name, tag_type, html_tag):
        html = mopen.open(url).read()
        soup = BeautifulSoup(html, "html.parser")

I have searched on Stack Overflow and Google. The answers mostly suggest using the urllib2 / urllib libraries with a user agent and setting the SSL context to skip verification (ssl.CERT_NONE), e.g. How do I disable the ssl check in python 3.x?

But I guess the same is not applicable when I use FancyURLopener: when I pass a context to the open() method along with the url, it throws an invalid-arguments error.
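For what it's worth, from Python 2.7.9 onward urllib.URLopener (and therefore FancyURLopener) appears to accept the SSL context in its constructor rather than in open(), which might avoid the invalid-arguments error. A minimal sketch (the FancyURLopener part is Python 2 only and left as comments; disabling verification is insecure and should only be a temporary workaround):

```python
import ssl

# An unverified context skips certificate checks entirely -- insecure,
# acceptable only as a stopgap while the real certificate issue is fixed.
ctx = ssl._create_unverified_context()

# Python 2.7.9+ sketch (assumption, not verified here): pass the context
# to the constructor, not to open().
# from urllib import FancyURLopener
# mopen = FancyURLopener(context=ctx)
# html = mopen.open(url).read()
```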

Python version: 2.7.12

Any leads would be helpful.

Thanks in advance.

1 Answer


I was able to figure out a workaround. I added the snippet below to the code, and it bypasses certificate verification.

    import ssl

    ssl._create_default_https_context = ssl._create_unverified_context
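For context, this monkey-patches the factory the standard library uses to build the default HTTPS context, so every subsequent HTTPS connection in the process skips certificate verification, not just the one call. A minimal sketch of what the swap does (insecure; prefer fixing the server's certificate chain or the local CA bundle where possible):

```python
import ssl

# Replace the default-context factory with the unverified one.
ssl._create_default_https_context = ssl._create_unverified_context

# Any code that calls ssl._create_default_https_context() from now on
# (urllib's HTTPS handling does) gets a context with verification off.
ctx = ssl._create_default_https_context()
```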