
I am trying to load data from DeepChem's MoleculeNet collection for my deep learning project. However, loading the dataset fails with an SSL error raised from Python 3.6's urllib. I am on Ubuntu 16.04.

import deepchem
tasks, datasets, transformers = deepchem.molnet.load_delaney(featurizer='GraphConv')

Below is the error traceback:

File "/usr/lib/python3.6/urllib/request.py", line 1349, in do_open
    encode_chunked=req.has_header('Transfer-encoding'))
  File "/usr/lib/python3.6/http/client.py", line 1287, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.6/http/client.py", line 1333, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.6/http/client.py", line 1282, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.6/http/client.py", line 1042, in _send_output
    self.send(msg)
  File "/usr/lib/python3.6/http/client.py", line 980, in send
    self.connect()
  File "/usr/lib/python3.6/http/client.py", line 1448, in connect
    server_hostname=server_hostname)
  File "/usr/lib/python3.6/ssl.py", line 407, in wrap_socket
    _context=self, _session=session)
  File "/usr/lib/python3.6/ssl.py", line 817, in __init__
    self.do_handshake()
  File "/usr/lib/python3.6/ssl.py", line 1077, in do_handshake
    self._sslobj.do_handshake()
  File "/usr/lib/python3.6/ssl.py", line 689, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:852)

I searched for this problem and read about regenerating the certificates in SSL: CERTIFICATE_VERIFY_FAILED on Ubuntu 18.04.1, using the commands below:

$ sudo update-ca-certificates --fresh
$ export SSL_CERT_DIR=/etc/ssl/certs

However, this solution did not work for me and I still get the same error, perhaps because I am on Ubuntu 16.04 rather than 18.04.

On the other hand, sending an HTTPS request to Google with requests works fine and returns a 200 response:

import requests
requests.get("https://google.com")

Because of this problem I cannot run my deep learning program, and I have never faced this kind of security error before. I would appreciate any help.

Please read https://access.redhat.com/articles/2039753 and https://stackoverflow.com/questions/27835619/urllib-and-ssl-certificate-verify-failed-error – JohannesB Apr 10 '21 at 09:50

@JohannesB Thanks for the sources. `ssl._create_default_https_context = ssl._create_unverified_context` helped me to bypass the error. – selubamih Apr 10 '21 at 12:08
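
For reference, a minimal sketch of how the bypass from the last comment would be wired in (an assumption on my part; note that it disables certificate verification for every HTTPS request urllib makes in the process, so it trades away security):

import ssl
import deepchem

# Workaround mentioned in the comment above: swap urllib's default HTTPS
# context for an unverified one, so certificate checks are skipped (insecure).
ssl._create_default_https_context = ssl._create_unverified_context

tasks, datasets, transformers = deepchem.molnet.load_delaney(featurizer='GraphConv')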

0 Answers