I build web apps on a very small scale, mostly for personal use only, so database size and query times have never been an issue for me.
I use the InnoDB storage engine because I am used to working with constraints (ON DELETE CASCADE), since I spread the information across many tables and join them later.
Now I have created a web application into which I import data from CSV files once a week. The files are sometimes bigger than 1 MB and contain more than 50,000 rows.
The CSV files come from an old application that has no API; the only way to access its data is to export it to CSV and import it into my app.
I upload the CSV to the server with PHP and add the data to my MySQL database row by row: I loop over every row of the file and run a separate "INSERT INTO" query for it. This results in very long load times for creating the rows and in hitting the database memory limit (default value). Increasing the limit feels like the wrong approach and still leads to long query times.
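Roughly, the import currently looks like this (simplified sketch; the connection details, table name and column names are placeholders, not my real schema):

```
<?php
// Simplified sketch of what I do now; names are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=myapp;charset=utf8mb4', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$handle = fopen($_FILES['csv']['tmp_name'], 'r');
fgetcsv($handle); // skip the header row

$stmt = $pdo->prepare('INSERT INTO main_table (number, name, type) VALUES (?, ?, ?)');

while (($row = fgetcsv($handle)) !== false) {
    // one INSERT per CSV row, which is where the time goes
    $stmt->execute([$row[0], $row[1], $row[2]]);
}
fclose($handle);
```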
What can I do to improve the performance of my database?
Edit: I can't load the CSV into MySQL directly because I need to split the content across different tables.
E.g. the CSV has a `types` column. I created a `types` table, add each value to that table, and store only the id in the main table. That way I have more flexibility in PHP later, because I don't need to handle value names; I can just work with ids.
Additionally I do things like "if the number starts with XY, then add value Z to a specific column in the MySQL table".
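To make the splitting concrete, this is roughly what happens for every CSV row (simplified sketch; `types`, `main_table`, the column names, the prefix "XY" and the value "Z" are placeholders for my real schema and rules):

```
<?php
// Sketch of the per-row logic described above; all names are placeholders.
function importRow(PDO $pdo, array $row): void
{
    [$number, $name, $typeName] = $row;

    // Normalisation step: look the type up in the `types` table,
    // insert it if it is new, and keep only its id for the main table.
    $stmt = $pdo->prepare('SELECT id FROM types WHERE name = ?');
    $stmt->execute([$typeName]);
    $typeId = $stmt->fetchColumn();
    if ($typeId === false) {
        $pdo->prepare('INSERT INTO types (name) VALUES (?)')->execute([$typeName]);
        $typeId = (int) $pdo->lastInsertId();
    }

    // Extra rule: "if the number starts with XY, then add value Z
    // to a specific column".
    $extra = (strpos($number, 'XY') === 0) ? 'Z' : null;

    $pdo->prepare('INSERT INTO main_table (number, name, type_id, extra) VALUES (?, ?, ?, ?)')
        ->execute([$number, $name, $typeId, $extra]);
}
```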