
I am trying to scrape this page to obtain a list of all the repairers, but I am running into some difficulties.

Currently, I get this error:

requests.exceptions.SSLError: HTTPSConnectionPool(host='www.renault-retail-group.fr', port=443): Max retries exceeded with url: /concessions-renault.html (Caused by SSLError(SSLError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:777)'),))
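For context, `CERTIFICATE_VERIFY_FAILED` means Python could not validate the server's certificate chain (often a stale or missing local CA bundle). A minimal stdlib-only sketch of how verification can be relaxed for debugging — note this is insecure and only a last resort, not a proper fix:

```python
import ssl
import urllib.request

# Default context: full certificate and hostname verification (the safe setting)
ctx = ssl.create_default_context()

# Insecure fallback for debugging only: disable hostname check and verification
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

# The context would then be passed to the request, e.g.:
# urllib.request.urlopen("https://www.renault-retail-group.fr/concessions-renault.html", context=ctx)
```

With `requests`, the equivalent (equally insecure) switch is `requests.get(url, verify=False)`.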

My script:

import requests
from bs4 import BeautifulSoup
from selenium import webdriver
from selenium.common.exceptions import TimeoutException
from random import randint
import time


url = "https://www.renault-retail-group.fr/concessions-renault.html"

chrome_path = r"C:\Users\XXX\Desktop\chromedriver_win32 (1)\chromedriver.exe"
driver = webdriver.Chrome(chrome_path)
driver.maximize_window()

try:
    driver.get(url)
except TimeoutException:  # driver.get raises Selenium's TimeoutException, not the builtin
    driver.execute_script("window.stop();")

time.sleep(randint(2, 3))

# This second request bypasses Selenium entirely and is where the SSLError occurs
r = requests.get(url)
soup = BeautifulSoup(r.content, "html.parser")

g_data = soup.findAll("div", {"class": "audColResultatConcessionDetail"})

dict_name_r = []

for item in g_data:
    # dict_name_r is a list, so append to it rather than calling it like a function
    dict_name_r.append(item.contents[1].findAll("h6", {"class": "audColResultatConcessionNom ng-binding"}))

print(dict_name_r)

Thanks a lot for your help.

  • Hello, I am not comfortable enough with Python, but I think there is a [possible solution here](https://stackoverflow.com/a/39124754/9483405) – Yanis-git Apr 25 '18 at 12:54
  • You can just use `driver.page_source` to get the HTML of the website if you want to use Selenium. As for the error, it works fine on my machine. I don't know why it occurs in your case, but maybe [this](https://stackoverflow.com/questions/10667960/python-requests-throwing-sslerror) thread will help you. – radzak Apr 25 '18 at 14:44
  • I will test that, thank u :) – Julien Bourbon Apr 25 '18 at 15:19
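Following the `driver.page_source` suggestion above, a minimal sketch of the parsing step. The HTML string here is a hypothetical stand-in for the real page (the class names are copied from the question's selectors); with Selenium you would instead feed `driver.page_source` to BeautifulSoup. It also shows the fix for the original script's bug — appending to the list instead of calling it:

```python
from bs4 import BeautifulSoup

# Hypothetical markup mimicking the structure the question's selectors assume;
# in the real script this would be: html = driver.page_source
html = """
<div class="audColResultatConcessionDetail">
  <a><h6 class="audColResultatConcessionNom ng-binding">Renault Paris</h6></a>
</div>
<div class="audColResultatConcessionDetail">
  <a><h6 class="audColResultatConcessionNom ng-binding">Renault Lyon</h6></a>
</div>
"""

soup = BeautifulSoup(html, "html.parser")

names = []  # a plain list, so use .append() rather than calling it
for item in soup.find_all("div", {"class": "audColResultatConcessionDetail"}):
    h6 = item.find("h6", {"class": "audColResultatConcessionNom ng-binding"})
    if h6:
        names.append(h6.get_text(strip=True))

print(names)  # ['Renault Paris', 'Renault Lyon']
```

Parsing the Selenium-rendered page this way also sidesteps the second `requests.get(url)` call, which is where the SSL error is raised.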

0 Answers