I'm trying to write a Python script that parses a CSV file and updates a database with the new values read from it. My code looks like this:
from datetime import datetime

startTime = datetime.now()
db = <Get DB Handle>
counter = 0

with open('CSV_FILE.csv') as csv_file:
    data = csv_file.read().splitlines()
    for line in data:
        data1 = line.split(',')
        # one UPDATE statement per CSV row
        execute_string = ("update table1 set col1=" + data1[1] +
                          " where col0 is '" + data1[0] + "'")
        db.execute(execute_string)
        counter = counter + 1
        if counter % 1000 == 0:
            print ".",   # progress marker every 1000 rows

print ""
print datetime.now() - startTime
That operation took about 10 minutes to finish. Is there any way I can tweak my SQL query to speed it up?
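For reference, this is the kind of batched, parameterized version I was thinking of trying instead. It's only a sketch: the sqlite3 connection, the 'my_database.db' filename, and the "?" placeholders are my assumptions (other DB-API drivers use a different paramstyle such as %s), and the csv module replaces my manual line.split(',').

import csv
import sqlite3  # assumption: any DB-API 2.0 driver with executemany() should look similar
from datetime import datetime

startTime = datetime.now()
conn = sqlite3.connect('my_database.db')  # hypothetical handle; replace with the real connection

with open('CSV_FILE.csv') as csv_file:
    # Let the csv module do the splitting and collect (col1, col0) parameter pairs.
    rows = [(row[1], row[0]) for row in csv.reader(csv_file)]

# One executemany() call and a single commit instead of one round trip per row.
conn.executemany("update table1 set col1 = ? where col0 = ?", rows)
conn.commit()
conn.close()

print(datetime.now() - startTime)

Would batching the updates like this (one transaction, driver-side parameter binding) be the right direction, or is there a better way to speed up the UPDATE itself?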