I am trying to insert the data I scraped into my MySQL database, but I can't figure out this error:
mysql.connector.errors.ProgrammingError: 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ''news_tb' ('title', 'summary', 'date') VALUES (' Price Analysis 5/22: BTC, ETH, ' at line 1
Code:
from urllib.request import urlopen
from bs4 import BeautifulSoup
import mysql.connector
import datetime
# connect to the database
cnx = mysql.connector.connect(
    user='root',
    password='password',
    host='localhost',
    database='news_db'
)
cursor = cnx.cursor()
sql = "INSERT INTO 'news_tb' ('title', 'summary', 'date') VALUES (%s, %s, %s)"
def crawl_url(article_arr):
    url = "https://cointelegraph.com/tags/bitcoin"
    html = urlopen(url)
    soup = BeautifulSoup(html, "html.parser")
    ## Scrape articles
    articles = soup.findAll("article", {"class":"post-card-inline"})
    try:
        for article in articles:
            title = article.find("span", {"class":"post-card-inline__title"}).get_text()
            summary = article.find("p", {"class":"post-card-inline__text"}).get_text()
            date = datetime.datetime.today()
            article_arr.append((title, summary, date))
    finally:
        return article_arr
article_arr = crawl_url([])
#print(len(article_arr))
cursor.executemany(sql, article_arr)
cnx.commit()
cursor.close()
cnx.close()
Can someone help me with my code? I'm not sure what I am doing wrong. I exported the same scraped data to a .csv file earlier without any problems, so the data itself should be fine; the sketch below shows roughly how that export looked.
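For context, this is approximately what the earlier CSV export did (I don't have that script in front of me, so the filename and header row here are from memory, not the exact code):

import csv

# Rough sketch of the earlier export: write the scraped rows to a CSV file.
# "articles.csv" and the header row are approximations of the original script.
article_arr = crawl_url([])

with open("articles.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["title", "summary", "date"])   # header row
    writer.writerows(article_arr)                   # one row per (title, summary, date) tuple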