
I am trying to run the pretrained mask_rcnn_R_50_FPN_3x model from detectron2 on an image, but I get the error ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1131).

I am using Windows Subsystem for Linux. The following code produces the error.

from detectron2.config import get_cfg
from detectron2 import model_zoo
from detectron2.engine import DefaultPredictor

cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml"))
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.5
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
cfg.MODEL.DEVICE = 'cpu'

# The error is raised here, when the checkpoint is downloaded from dl.fbaipublicfiles.com
predictor = DefaultPredictor(cfg)

I've tried updating the certifi package.
I've tried

sudo apt install ca-certificates
sudo update-ca-certificates --fresh
export SSL_CERT_DIR=/etc/ssl/certs

based on one of the answers here: https://stackoverflow.com/questions/52805115/certificate-verify-failed-unable-to-get-local-issuer-certificate. I've also tried downloading the certificates for https://dl.fbaipublicfiles.com (in Google Chrome: click the padlock symbol -> 'Connection is secure' -> 'Certificate is valid' -> 'Details' -> 'Copy to file', repeating this for each certificate under the 'Certification Path' tab) and copying their contents into the cacert.pem file.
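
In code form, that last copy step roughly amounts to appending the exported PEM files to certifi's bundle (the file names below are placeholders for whatever was exported from Chrome):

import certifi

# Append each exported certificate to certifi's cacert.pem (placeholder file names)
for exported in ("fbaipublicfiles-leaf.pem", "fbaipublicfiles-intermediate.pem"):
    with open(exported) as src, open(certifi.where(), "a") as bundle:
        bundle.write("\n" + src.read())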


UPDATE:
It seems to have something to do with the urllib.request module (although I might be misunderstanding things). I have found that

from urllib import request
request.urlretrieve('https://dl.fbaipublicfiles.com')

(the urlretrieve function is called by detectron2) results in the same error, whereas

import requests
requests.get('https://dl.fbaipublicfiles.com')

works fine.
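
A small diagnostic that seems consistent with this (my interpretation: urllib's default HTTPS context falls back to OpenSSL's default CA paths, while requests explicitly loads certifi's bundle):

import ssl
import certifi

# Where urllib's default context looks for CAs vs. the bundle that requests loads
print(ssl.get_default_verify_paths())
print(certifi.where())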

  • The certificate for this site is fine. Make sure that there is no SSL intercepting component in the communication path, like a corporate proxy or in case of WSL an SSL intercepting antivirus on the Windows system. – Steffen Ullrich Aug 08 '22 at 14:30
  • Thank you for your reply. Any chance you could give some pointers on how to check this? – user3221037 Aug 08 '22 at 14:39
  • Try to use `openssl s_client -connect dl.fbaipublicfiles.com:443 | openssl x509 -text -noout` from inside WSL and see what it reports as certificate issuer. Should be "CN = DigiCert SHA2 High Assurance Server CA". – Steffen Ullrich Aug 08 '22 at 15:47
  • Can you run `python -c 'import certifi; print(certifi.where())'` and post result ? – Philippe Aug 08 '22 at 20:28
  • Absolutely, it prints: `/home//detectron2/detectronenv/lib/python3.8/site-packages/certifi/cacert.pem`. – user3221037 Aug 09 '22 at 11:19

2 Answers


Try to put this after your import statements:

import certifi
import ssl

def create_context():
    # Build a client context that verifies servers against certifi's CA bundle
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    context.load_verify_locations(certifi.where())
    return context

# urllib calls this factory when it builds its default HTTPS context
ssl._create_default_https_context = create_context

This tells urllib to use certifi's certificates.
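
For example, with the context factory installed above, the checkpoint download that previously failed should go through (a sketch; the output filename is arbitrary):

from urllib import request
from detectron2 import model_zoo

# Same URL detectron2 fetches internally; urlretrieve now verifies
# dl.fbaipublicfiles.com against certifi's bundle
url = model_zoo.get_checkpoint_url("COCO-InstanceSegmentation/mask_rcnn_R_50_FPN_3x.yaml")
request.urlretrieve(url, "model_final.pkl")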

Philippe

Solution:

import ssl

# Disables certificate verification for all urllib HTTPS requests
ssl._create_default_https_context = ssl._create_unverified_context
  • Hi blackthorn, welcome to Stack Overflow and thank you for your answer! I do not think reassigning `_underscore_prefixed` variables is a good way to approach this. According to https://peps.python.org/pep-0008/#descriptive-naming-styles, such variables are intended for internal use only. They are not part of the public interface, so if the `ssl` library were updated in the future, the names, layout, or even existence of such variables could change, because the authors rely on them not being used externally, which makes this approach extremely brittle. – mingaleg Feb 06 '23 at 21:27
  • Hi mingaleg, thanks for the note, I fully agree with your remark for future use! But so far this was the only solution that allowed me to quickly fix the code and get everything up and running :) – blackthorn Feb 08 '23 at 09:39