I need to insert a 60000x24 dataframe into a MySQL database (MariaDB) using SQLAlchemy and Python. Both the database and the insertion code run locally. So far I have been using the LOAD DATA INFILE SQL statement, but this requires the dataframe to be dumped to a CSV file first, which takes about 1.5-2 seconds. The problem is that I have to insert 40 or more of these dataframes, so the total time is critical.
If I use df.to_sql instead, the problem gets much worse: the insertion takes at least 7 (and up to 30) seconds per dataframe.
The code I'm using is shown below:
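For context, the connection objects used in the snippets are created more or less like this (a simplified sketch; the driver, credentials and database name are placeholders for my local setup):

from sqlalchemy import create_engine

# placeholder URL; the real one points at my local MariaDB instance
engine = create_engine("mysql+pymysql://user:password@localhost/mydb")
connection = engine.connect()               # passed to df.to_sql as con=
cursor = engine.raw_connection().cursor()   # used for the raw CREATE TABLE / LOAD DATA statements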
sql_query ="CREATE TABLE IF NOT EXISTS table(A FLOAT, B FLOAT, C FLOAT)"# 24 columns of type float
cursor.execute(sql_query)
data.to_sql("table", con=connection, if_exists="replace", chunksize=1000)
This takes between 7 and 30 seconds to execute. Using LOAD DATA, the code looks like this:
sql_query = "CREATE TABLE IF NOT EXISTS table(A FLOAT, B FLOAT, C FLOAT)"# 24 columns of type float
cursor.execute(sql_query)
data.to_csv("/tmp/data.csv")
sql_query = "LOAD DATA LOW_PRIORITY INFILE '/tmp/data.csv' REPLACE INTO TABLE 'table' FIELDS TERMINATED BY ','; "
cursor.execute(sql_query)
This takes 1.5 to 2 seconds, most of it spent dumping the dataframe to CSV. I could shave a bit off by wrapping the load in LOCK TABLES, but then no data ends up in the table at all. So my question is: is there any way to speed this process up, either by tweaking LOAD DATA or to_sql?
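On the to_sql side, the only tweak I can think of is passing method="multi" with a larger chunksize; is that the right direction? A sketch of what I mean (untested, and I'm assuming it maps well onto MariaDB):

# sketch of the multi-row INSERT variant of to_sql; the chunksize value is a guess
data.to_sql(
    "table",
    con=connection,
    if_exists="replace",
    index=False,
    chunksize=5000,   # rows per batch
    method="multi",   # one multi-row INSERT per chunk instead of per-row executemany
)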
UPDATE: Using an alternative function to dump the dataframes to CSV, taken from this answer to What is the fastest way to output large DataFrame into a CSV file?, I'm able to improve the performance a little, but not significantly.
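To give an idea of the kind of change: the alternative writer formats the floats by hand instead of going through the generic to_csv machinery. A minimal sketch (not the exact code from that answer, and it assumes a purely float dataframe, which is my case) would be:

import numpy as np

def fast_to_csv(df, path):
    # fixed float format instead of pandas' generic per-cell formatting;
    # only valid because all 24 columns are floats
    with open(path, "w") as f:
        np.savetxt(f, df.values, fmt="%.6f", delimiter=",")

fast_to_csv(data, "/tmp/data.csv")

Best,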