
I build web apps on a very small scale, mostly for personal use only, so database sizes and query times have never been an issue for me.

I use the InnoDB storage engine because I am used to working with constraints (ON DELETE CASCADE), since I split the information across many tables and join it later.

Now I have created a web application where I import data from CSV files once a week. The file is sometimes bigger than 1 MB and has more than 50,000 rows.

The CSV files come from an old application that has no API. The only way to access its data is to export it to CSV and import it into my app.

I just upload the CSV to the server through PHP and add the data to my MySQL database row by row, also with PHP: I go through every row of the file and run an INSERT INTO query for it. This leads to long load times for creating the rows, and I also hit the database memory limit (at its default value). Increasing the limit feels like doing something wrong and still ends in long query times.

What can I do to improve the performance of my database?

Edit: I can't load the CSV directly into MySQL because I need to split the content across different tables. E.g.: the CSV has a column with types. I created a types table, add the values to that table, and in the main table I only store the id. That gives me more flexibility in PHP later, because I don't need to handle value names; I can just work with ids. Additionally I do things like "if the number starts with XY, then add value Z to a specific column in the MySQL table".
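A simplified sketch of what my import currently does (table and column names here are just placeholders, and I assume an existing PDO connection $pdo):

// Current row-by-row import (placeholder names, existing $pdo assumed).
$rows = array_map('str_getcsv', file('/path/to/upload.csv'));

foreach ($rows as $row) {
    list($typeName, $number) = $row;

    // Look up (or create) the type and keep only its id for the main table.
    $stmt = $pdo->prepare('SELECT id FROM types WHERE name = ?');
    $stmt->execute([$typeName]);
    $typeId = $stmt->fetchColumn();
    if ($typeId === false) {
        $pdo->prepare('INSERT INTO types (name) VALUES (?)')->execute([$typeName]);
        $typeId = $pdo->lastInsertId();
    }

    // "If the number starts with XY, add value Z to a specific column."
    $extra = (strpos($number, 'XY') === 0) ? 'Z' : null;

    // One INSERT per CSV row -- this is the slow part.
    $pdo->prepare('INSERT INTO main_data (type_id, number, extra) VALUES (?, ?, ?)')
        ->execute([$typeId, $number, $extra]);
}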

T. Karter

2 Answers


If you have the data in a file, use LOAD DATA INFILE. This will be much faster than inserting the data one row at a time. Do note that the file needs to be accessible on the server where MySQL is running. This should not be an issue for a personal application (which is presumably running on a single machine).
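Roughly, from PHP with PDO it could look like this (file path, table and column names are placeholders; the LOCAL variant reads the file from the PHP side and also needs local_infile enabled on the server):

// Sketch: bulk-load the uploaded CSV into one table with LOAD DATA.
// All names below are placeholders -- adjust them to your schema.
$pdo = new PDO(
    'mysql:host=localhost;dbname=myapp;charset=utf8mb4',
    'user',
    'password',
    [PDO::MYSQL_ATTR_LOCAL_INFILE => true] // allow the LOCAL variant from PHP
);

$pdo->exec("
    LOAD DATA LOCAL INFILE '/path/to/upload.csv'
    INTO TABLE staging_import
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES
    (type_name, number)
");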

I would also recommend loading the data into a staging table first, and then inserting from that table into the final tables. This makes it much simpler to handle any glitches in the incoming data.
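Still as a sketch with placeholder names, the second step could then be done set-based instead of row by row; this also covers the "number starts with XY" rule from the question:

// 1. Add any type names that do not exist yet.
$pdo->exec("
    INSERT INTO types (name)
    SELECT DISTINCT s.type_name
    FROM staging_import s
    LEFT JOIN types t ON t.name = s.type_name
    WHERE t.id IS NULL
");

// 2. Copy the rows, replacing the type name with its id and applying
//    the 'number starts with XY -> set column to Z' rule in SQL.
$pdo->exec("
    INSERT INTO main_data (type_id, number, extra)
    SELECT t.id,
           s.number,
           CASE WHEN s.number LIKE 'XY%' THEN 'Z' ELSE NULL END
    FROM staging_import s
    JOIN types t ON t.name = s.type_name
");

// 3. Empty the staging table for the next weekly import.
$pdo->exec("TRUNCATE TABLE staging_import");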

Gordon Linoff

If you need to process the data in your PHP application before inserting, you can assemble one INSERT statement with multiple rows, which is much faster in MySQL than one INSERT statement per row.

$data = array(
    array(
        'title' => 'My title',
        'name'  => 'My Name',
        'date'  => 'My date'
    ),
    array(
        'title' => 'Another title',
        'name'  => 'Another Name',
        'date'  => 'Another date'
    )
);

$this->db->insert_batch('mytable', $data);

// Produces: INSERT INTO mytable (title, name, date) VALUES ('My title', 'My name', 'My date'), ('Another title', 'Another name', 'Another date')
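The snippet above uses CodeIgniter's Query Builder (insert_batch). Without a framework, the same multi-row INSERT could be built with plain PDO, roughly like this (assuming an existing $pdo connection and the same hypothetical table):

// Sketch: one INSERT with a (?, ?, ?) placeholder group per row.
$rows = [
    ['title' => 'My title',      'name' => 'My Name',      'date' => 'My date'],
    ['title' => 'Another title', 'name' => 'Another Name', 'date' => 'Another date'],
];

$placeholders = implode(', ', array_fill(0, count($rows), '(?, ?, ?)'));

$values = [];
foreach ($rows as $row) {
    array_push($values, $row['title'], $row['name'], $row['date']);
}

$stmt = $pdo->prepare("INSERT INTO mytable (title, name, date) VALUES $placeholders");
$stmt->execute($values);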
spike 王建