I'm trying to get scraped Chinese text from Python Scrapy into a MySQL database, but it seems that either Scrapy or MySQL can't handle Chinese characters with my current method:
def insert_table(datas):
    sql = "INSERT INTO %s (name, uses, time_capt) \
           values('%s', '%s', NOW())" % (SQL_TABLE,
                                         escape_string(datas['name']),
                                         escape_string(datas['uses']),
                                         )
    if cursor.execute(sql):
        print "Inserted"
    else:
        print "Something wrong"
I keep getting this error when inserting into the MySQL database:
exceptions.UnicodeEncodeError: 'ascii' codec can't encode characters in position 1-7: ordinal not in range(128)
datas['name']
contains correctly formatted Chinese characters. If I print the variable within the code, it comes out properly. I've tried calling .encode('utf-8')
on it before inserting into MySQL, which makes the error go away, but then the text comes out as garbled gibberish in my database. Am I doing something wrong?
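For reference, here's a minimal standalone sketch of the encoding behavior I'm seeing, using nothing beyond the standard library. The MySQLdb connection details in the comments reflect my understanding of the usual pattern and are assumptions, not tested code:

```python
# -*- coding: utf-8 -*-
# Sketch: why the 'ascii' codec error appears, and why an explicit
# UTF-8 encode round-trips the text cleanly.

name = u'\u4e2d\u6587'  # u'中文'

# Forcing the ascii codec (what Python 2 does implicitly when mixing
# unicode into a byte string) fails on any non-ASCII character:
try:
    name.encode('ascii')
except UnicodeEncodeError as exc:
    print('ascii codec fails: %s' % exc)

# Encoding explicitly as UTF-8 works and round-trips without loss:
encoded = name.encode('utf-8')
assert encoded.decode('utf-8') == name

# My understanding (untested assumption) is that with MySQLdb the usual
# approach is to open the connection with an explicit character set and
# let the driver handle encoding, passing values as execute() parameters
# (the table name can't be a parameter, so it is interpolated separately):
#
#   conn = MySQLdb.connect(host='localhost', user='u', passwd='p',
#                          db='d', charset='utf8', use_unicode=True)
#   sql = ("INSERT INTO %s (name, uses, time_capt) "
#          "VALUES (%%s, %%s, NOW())") % SQL_TABLE
#   cursor.execute(sql, (datas['name'], datas['uses']))
```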
Edit: this is my current code:
def insert_table(datas):
    sql = "INSERT INTO %s (name, uses, time_capt) \
           values(%s, %s, NOW())", (SQL_TABLE,
                                    escape_string(datas['name']),
                                    escape_string(datas['uses']),
                                    )
    if cursor.execute(sql):
        print "Inserted"
    else:
        print "Something wrong"