#!/usr/bin/env python
# -*- coding: utf-8 -*-

import json

import mysql.connector
import pandas as pd

con = mysql.connector.connect(user='', password='', host='', database='')
cursor = con.cursor()
sqlquery = "blahblahblah"
print(sqlquery)
searches = []
print(searches)
cursor.execute(sqlquery)
for row in cursor:
    try:
        searchname = row[0].encode("utf-8")
        queryobject = row[1].encode("utf-8")
    except Exception:
        print('Query not found')
        continue  # skip this row rather than reusing the previous queryobject
    json_dict = json.loads(queryobject)  # parse the stored JSON string into a dict
    searches.append(json_dict)
    print(searches)
pd.DataFrame(searches).to_csv('datefile.csv', index=False)

Hey, I'm trying to write the rows above into a CSV, which is working; however, due to what I assume is Unicode, the output has a 'u' before every word. The data is fine up until json_dict = json.loads(queryobject). Any help would be greatly appreciated; this is driving me mad.
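For context, here's a minimal sketch of what I think is happening, using a made-up JSON value (not the real query data): under Python 2, json.loads always returns unicode strings, so any nested dict or list that reaches the CSV via its repr picks up the u prefix.

# Python 2 behaviour, hypothetical JSON value
import json

queryobject = '{"term": "shoes", "filters": {"color": "red"}}'
json_dict = json.loads(queryobject)
print(json_dict)
# {u'term': u'shoes', u'filters': {u'color': u'red'}}
# If a nested value like the 'filters' dict ends up in a DataFrame cell,
# to_csv writes its repr, which is where the u prefixes show up.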

Michael Blair

2 Answers


You are using json.loads when you should be using json.dumps.
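For reference, a minimal sketch of the two directions (made-up value, not the asker's data):

import json

# json.loads parses JSON text into Python objects; under Python 2 the strings come back as unicode
parsed = json.loads('{"term": "shoes"}')
print(parsed)               # {u'term': u'shoes'}

# json.dumps goes the other way: Python objects back to plain JSON text, with no u prefix
print(json.dumps(parsed))   # {"term": "shoes"}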

Nathan Hinchey

I fixed this by changing to Python 3.
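For anyone who lands here: under Python 3 every string is already Unicode, so the repr that ends up in the CSV no longer carries the prefix. A quick check with the same made-up JSON value as above:

import json

row = json.loads('{"term": "shoes", "filters": {"color": "red"}}')
print(row)
# {'term': 'shoes', 'filters': {'color': 'red'}}  -- no u prefix under Python 3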

Michael Blair