
I have a DB table with approximately 40 columns, and the main goal is to insert records into the database as quickly as possible. I am using PHP for this.

The problem is, to create the insert statement I have to loop through a foreach. I am not sure if I am doing this correctly. Please suggest the best alternative. Here is the example:

/// to loop through the available data ///

$sqc = "";
for ($i = 1; $i <= 100; $i++) {
    if ($sqc == "") {
        $sqc = "('".$array_value["col1"]."'.. till .. '".$array_value["col40"]."')";
    } else {
        $sqc .= ",('".$array_value["col1"]."'.. till .. '".$array_value["col40"]."')";
    }
}

/// finally the sql query ///

$sql_query = "INSERT INTO table_name (`col1`, .. till .. ,`col40`) VALUES ".$sqc;

This concatenation into $sqc is taking a lot of time, and so is the insertion into the DB. Is there an alternate way of doing this? I need to find a way to speed this up, ideally by 100x.. :(

Thank you

Amit Shah
  • Buy a faster computer? What I mean is that the information you have given us contains no clue as to how things could be made faster. We have no idea why you do this, why you're in a hurry, what the data is, etc. Your question is too abstract and lacks information. – KIKO Software Mar 21 '17 at 07:15
  • How do you access the DB: mysqli? PDO? – ScaisEdge Mar 21 '17 at 07:19
  • Have you done research on the engine used for the table? For example, is InnoDB used? Also a possible duplicate: http://stackoverflow.com/questions/9819271/why-is-mysql-innodb-insert-so-slow – Balazs Vago Mar 21 '17 at 07:21
  • @scaisEdge - using mysqli - codeigniter – Amit Shah Mar 21 '17 at 07:48
  • @BalazsVago yes, InnoDB is used.. no, there are no duplicates.. there are fresh 100-200 records with 40 cols each.. – Amit Shah Mar 21 '17 at 07:50
  • Why are you looping 100 times? – ScaisEdge Mar 21 '17 at 07:50
  • I doubt the loop is the bottleneck here. And why are you inserting the same data 100 times? I'd have a lot of suggestions on how to speed up your database as well as how to work around any hard performance limit, but it looks like you are omitting something here. Perhaps you should clarify your problem so that we know exactly what you need. – Havenard Mar 21 '17 at 08:12
  • You could use async calls, for instance, so the script doesn't have to wait on the `INSERT` to succeed before it can do other stuff. Or, if it's some sort of cronjob running periodically in the background, you can output the queries to a MySQL command-line client instead of running the queries itself. – Havenard Mar 21 '17 at 08:17

2 Answers


As suggested on the MySQL Optimizing INSERT Statements page, there are a few ways to do this:

  • If you are inserting many rows from the same client at the same time, use INSERT statements with multiple VALUES lists to insert several rows at a time. This is considerably faster (many times faster in some cases) than using separate single-row INSERT statements. If you are adding data to a nonempty table, you can tune the bulk_insert_buffer_size variable to make data insertion even faster.

  • When loading a table from a text file, use LOAD DATA INFILE. This is usually 20 times faster than using INSERT statements.
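To illustrate the first point, here is a minimal sketch of building one multi-row `INSERT` with `?` placeholders, so a whole batch goes to the server in a single statement and the values can be bound safely afterwards. The function and variable names (`buildMultiRowInsert`, `$rows`, `$cols`) are illustrative, not from the question:

```php
<?php
// Sketch: build "INSERT INTO `t` (`a`, `b`) VALUES (?, ?), (?, ?), ..."
// for a batch of rows, instead of concatenating values row by row.
function buildMultiRowInsert(array $rows, array $cols, string $table): string {
    // One "(?, ?, ..., ?)" group per row, with one "?" per column.
    $placeholderRow  = '(' . implode(', ', array_fill(0, count($cols), '?')) . ')';
    $allPlaceholders = implode(', ', array_fill(0, count($rows), $placeholderRow));
    $colList = '`' . implode('`, `', $cols) . '`';
    return "INSERT INTO `$table` ($colList) VALUES $allPlaceholders";
}

// Example: 2 rows, 3 columns
// INSERT INTO `t` (`a`, `b`, `c`) VALUES (?, ?, ?), (?, ?, ?)
$sql = buildMultiRowInsert(
    [['a' => 1, 'b' => 2, 'c' => 3], ['a' => 4, 'b' => 5, 'c' => 6]],
    ['a', 'b', 'c'],
    't'
);
```

The resulting string would then be prepared once and the flattened row values bound to it; for 100-200 rows of 40 columns this sends one statement rather than 100-200 separate ones.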

See the MySQL guide: https://dev.mysql.com/doc/refman/5.7/en/insert-optimization.html

Ravi Karn

It's just a small contribution, but you can avoid the concatenation by using binding:

$stmt = mysqli_prepare($your_conn,
        "INSERT INTO table_name (`col1`, .. till .. ,`col40`) VALUES (?, .. till .. ?)");

mysqli_stmt_bind_param($stmt, 'ss.......s',
        $array_value["col1"], $array_value["col2"], .. till ..,
        $array_value["col40"]);
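With 40 columns, the type string and placeholder list can be generated rather than written out by hand. A small sketch (the `$numCols` name is illustrative; the spread-operator call assumes PHP 5.6+):

```php
<?php
// Generate the mysqli type string ("ssss...s") and the placeholder list
// ("?, ?, ..., ?") for 40 columns instead of typing them manually.
$numCols = 40;
$types = str_repeat('s', $numCols);                        // 40 x 's'
$placeholders = implode(', ', array_fill(0, $numCols, '?')); // "?, ?, ..., ?"

// These would then be used roughly like:
//   $stmt = mysqli_prepare($conn, "INSERT INTO table_name (...) VALUES ($placeholders)");
//   mysqli_stmt_bind_param($stmt, $types, ...array_values($row));
```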
ScaisEdge