I'm working on a school project and am trying to scrape data from websites using Python. I followed a tutorial on edureka - https://www.edureka.co/blog/web-scraping-with-python/#demo
This is the sample code:
from selenium import webdriver
from bs4 import BeautifulSoup
import pandas as pd
driver = webdriver.Chrome(r"D:\COSC2625_Team_Blue\chromedriver")  # raw string so the backslashes are not treated as escape sequences
products=[] #List to store name of the product
prices=[] #List to store price of the product
ratings=[] #List to store rating of the product
driver.get("https://www.flipkart.com/laptops/~buyback-guarantee-on-laptops-/pr?sid=6bo%2Cb5g&uniq")
content = driver.page_source
soup = BeautifulSoup(content, 'html.parser')
for a in soup.findAll('a', href=True, attrs={'class':'_31qSD5'}):
    name=a.find('div', attrs={'class':'_3wU53n'})
    price=a.find('div', attrs={'class':'_1vC4OE _2rQ-NK'})
    rating=a.find('div', attrs={'class':'hGSR34 _2beYZw'})
    products.append(name.text)
    prices.append(price.text)
    ratings.append(rating.text)
df = pd.DataFrame({'Product Name':products,'Price':prices,'Rating':ratings})
df.to_csv('products.csv', index=False, encoding='utf-8')
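While searching I also noticed that newer Selenium releases (4.x) want the driver path wrapped in a Service object instead of being passed directly. I don't know whether that is related to my problem, but this is the variant I found (the path is just my local chromedriver location):
from selenium import webdriver
from selenium.webdriver.chrome.service import Service
# same local driver path as above, wrapped in a Service object
service = Service(r"D:\COSC2625_Team_Blue\chromedriver")
driver = webdriver.Chrome(service=service)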
After running the code, a blank browser window showed up, and there was an error message in the terminal -
PS D:\COSC2625_Team_Blue> [8556:2544:0809/132114.260:ERROR:device_event_log_impl.cc(214)] [13:21:14.260] USB: usb_device_handle_win.cc:1048 Failed to read descriptor from node connection: A device attached to the system is not functioning. (0x1F)
I searched for this online and somebody said that it doesn't stop the code from running, so I waited for a while and another message appeared.
[4488:14404:0809/132312.207:ERROR:gpu_init.cc(486)] Passthrough is not supported, GL is disabled, ANGLE is
Then the whole programme got stuck. I waited for an hour, but there were still only these two messages, and the blank browser window was still there.
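From what I could find, some people suggest passing Chrome options to hide these two messages, for example something like this (I have not verified that it helps, and it would not explain why the page stays blank):
from selenium import webdriver
options = webdriver.ChromeOptions()
options.add_argument("--disable-gpu")  # suggested for the GPU/ANGLE message
options.add_experimental_option("excludeSwitches", ["enable-logging"])  # suggested for the USB descriptor log spam
driver = webdriver.Chrome(r"D:\COSC2625_Team_Blue\chromedriver", options=options)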
Does anyone know what happened here?