
I'm trying to crawl a site called Tirerack with Selenium.

from selenium import webdriver
from selenium.webdriver.common.keys import Keys
import time
import urllib.request

driver = webdriver.Chrome()
driver.get("https://www.tirerack.com/tires/TireSearchResults.jsp?tireIndex=0&autoMake=Audi&autoYear=2021&autoModel=Q5+2.0T&autoModClar=Premium&width=235%2F&ratio=60&diameter=18&sortCode=53910&skipOver=true&minSpeedRating=H&minLoadRating=S")
time.sleep(10)  # wait for the search results page to load
driver.find_element_by_id("oe-Tires").click()  # try to click the "OE Tires" filter
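
A more robust variant would presumably wait for the element explicitly instead of sleeping for a fixed time (a minimal sketch, assuming the id oe-Tires is correct and nothing on the site is blocking the automated browser):

from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
driver.get("https://www.tirerack.com/tires/TireSearchResults.jsp?tireIndex=0&autoMake=Audi&autoYear=2021&autoModel=Q5+2.0T&autoModClar=Premium&width=235%2F&ratio=60&diameter=18&sortCode=53910&skipOver=true&minSpeedRating=H&minLoadRating=S")

# Wait up to 20 seconds for the filter to become clickable instead of sleeping a fixed time.
wait = WebDriverWait(driver, 20)
wait.until(EC.element_to_be_clickable((By.ID, "oe-Tires"))).click()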

I then tried to script the part I needed step by step, but found that it did not work.

Finally, I concluded that I need to send Cookie and Authority values as request headers.

I checked the header values:

https://i.stack.imgur.com/FyrPK.png

But I can't figure out how to do this in code, so I posted this question.
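
The closest thing I could put together looks like this, but I'm not sure it is the right approach (a minimal sketch: cookies are added with add_cookie, and extra headers are injected through the Chrome DevTools Protocol; the cookie name/value and the header are placeholders, not the real values from the screenshot):

from selenium import webdriver

driver = webdriver.Chrome()

# Cookies can only be added for the domain the browser is currently on.
driver.get("https://www.tirerack.com")
driver.add_cookie({"name": "example_cookie", "value": "example_value"})  # placeholder cookie

# Chromium-based drivers can inject extra HTTP headers via the DevTools Protocol.
# Note: ":authority" is an HTTP/2 pseudo-header set by the browser itself,
# so only an ordinary placeholder header is added here.
driver.execute_cdp_cmd("Network.enable", {})
driver.execute_cdp_cmd("Network.setExtraHTTPHeaders", {"headers": {"x-example-header": "value"}})

driver.get("https://www.tirerack.com/tires/TireSearchResults.jsp?tireIndex=0&autoMake=Audi&autoYear=2021&autoModel=Q5+2.0T&autoModClar=Premium&width=235%2F&ratio=60&diameter=18&sortCode=53910&skipOver=true&minSpeedRating=H&minLoadRating=S")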

Can I get some hints?

장재익
  • What do you want to do, specifically? Add cookies or set headers in your opened browser? Or get the headers and print them out? – Vova Apr 01 '21 at 08:16
  • Take a look at this https://stackoverflow.com/questions/15645093/setting-request-headers-in-selenium – AliBaharni97 Apr 01 '21 at 08:43

1 Answer


I don't think you need any headers.

There is a label overlaid on top of the input box; you can try clicking the label instead.

from selenium import webdriver

driver = webdriver.Chrome()
driver.get("https://www.tirerack.com/tires/TireSearchResults.jsp?tireIndex=0&autoMake=Audi&autoYear=2021&autoModel=Q5+2.0T&autoModClar=Premium&width=235%2F&ratio=60&diameter=18&sortCode=53910&skipOver=true&minSpeedRating=H&minLoadRating=S")
# Click the label that overlays the "oe-Tires" checkbox.
driver.find_elements_by_xpath('//section/div/aside/section/ul/li/label[@for="oe-Tires"]')[0].click()
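
If the normal click still does nothing, you could also try a JavaScript click on the same element (just a sketch, assuming the same XPath still matches; it may not help if the site is blocking automation):

label = driver.find_elements_by_xpath('//section/div/aside/section/ul/li/label[@for="oe-Tires"]')[0]
driver.execute_script("arguments[0].click();", label)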
Yash
  • First of all, thank you for your answer. The reason I thought I had to handle headers or cookies is that nothing changed when I clicked the button in the oe-Tires section after the page came up from my script. If you open the same URL manually and press the OE Tires button, the page reloads sorted with those conditions applied. So can you give me another idea? With your answer, the situation still can't be solved. – 장재익 Apr 01 '21 at 10:50
  • I don't think cookies will solve your issue. It seems the website uses preventive measures to detect bots and deny them access. – Yash Apr 01 '21 at 11:38