
I am trying to populate a MySQL database table from a CSV file using PyMySQL. The CSV file has approximately 948,000 rows. The script runs without errors, but only about 840,000 rows end up in the database; I don't know where the rest go.

I am guessing this has something to do with the connection.commit() method, so I have tried committing once at the end of the script as well as after every 10,000 INSERT queries, but nothing has worked so far. Any ideas what I might be doing wrong?

I have attached the relevant code snippet below:

    import csv
    import gzip

    # Build the INSERT statement once; the columns never change between rows.
    sql = "INSERT INTO `{0}`({1}) VALUES({2})".format(
        table_name, ", ".join(columns), ", ".join(["%s"] * len(columns)))

    num = 0
    with gzip.open(temp_file_path, "rt", encoding="utf-8") as f:
        reader = csv.reader(f)
        for row in reader:
            # if num % 10000 == 0:
            #     conn.commit()
            print("[+] Processing row:", num)

            # cursor.execute returns the number of affected rows (1 on success)
            result = cursor.execute(sql, row)
            if result == 1:
                num += 1
            else:
                print("Not inserted!")
    conn.commit()
    cursor.close()
    conn.close()
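For completeness, here is a batched variant I have been considering, using cursor.executemany and committing after each batch. The helper name, the fake cursor idea, and the batch size are my own; I'm assuming conn and cursor come from pymysql.connect as in the snippet above, and that comparing the affected-row count against the batch size would surface any silently dropped rows:

```python
import itertools


def insert_in_batches(conn, cursor, table_name, columns, rows, batch_size=10000):
    """Insert rows with executemany, committing after each batch.

    executemany returns the total affected-row count for the batch,
    so a shortfall against len(batch) reveals silently dropped rows.
    """
    sql = "INSERT INTO `{0}`({1}) VALUES({2})".format(
        table_name, ", ".join(columns), ", ".join(["%s"] * len(columns)))

    inserted = 0
    it = iter(rows)
    while True:
        # Pull the next batch_size rows from the iterator (CSV reader, etc.).
        batch = list(itertools.islice(it, batch_size))
        if not batch:
            break
        affected = cursor.executemany(sql, batch)
        conn.commit()  # commit per batch so progress survives a crash
        if affected != len(batch):
            print("Batch short by {0} rows!".format(len(batch) - affected))
        inserted += affected
    return inserted
```

This way the row count reported by MySQL can be checked per batch instead of per row, which should also be considerably faster than one round-trip per INSERT.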
