
I have a script which inserts lots of data. Each row is mostly a replication of the previous insert, but at least one value differs. Therefore, I prepare the statement once, bind the params, execute, and repeat.

However, I tried to rewrite this as a prepared batch insert of at most 1000 rows at a time, and I can't get it to work. The code I'm using right now inserts all of the data in a single statement. That is not what I want, since the real data set may be much larger than the one I tested with.

The code I use now (Bulk insert all at once):

$Data = array(
    array("1", "2", "3", "4"),
    array("1", "2", "3", "5"),
    array("1", "2", "3", "6"),
    array("1", "2", "3", "7"),
    array("1", "2", "3", "8"),
    //ETC
    );
//AS YOU CAN SEE, ONLY COL4 CHANGES
$sql = $mysqli->prepare("INSERT INTO `Table` (Col1, Col2, Col3, Col4) VALUES ".implode(', ', array_fill(0, count($Data), "(?, ?, ?, ?)")));
foreach ($Data as $InsertData) {
    $sql->bind_param('iiii', $InsertData[0], $InsertData[1], $InsertData[2], $InsertData[3]);
    }
$sql->execute();

What I want to achieve is that the data is inserted with a prepared statement like above, but limited to 1000 rows (or any other number) per batch. I tried array_splice and other approaches, but I couldn't get it to work.

TVA van Hesteren
  • Possible duplicate of [MySQLi : Inserting multiple rows with one prepared statement](http://stackoverflow.com/questions/14860606/mysqli-inserting-multiple-rows-with-one-prepared-statement) – JohannesAndersson May 01 '17 at 16:19

2 Answers


You should consider using LOAD DATA INFILE.

It's likely to be far faster than using INSERT with batches of rows, even if you have to write out a temp file first.

$Data = array(
    array("1", "2", "3", "4"),
    array("1", "2", "3", "5"),
    array("1", "2", "3", "6"),
    array("1", "2", "3", "7"),
    array("1", "2", "3", "8"),
    //ETC
    );

$tempname = tempnam("/tmp", "data");
$fp = fopen($tempname, "w");
foreach ($Data as $fields) {
    fputcsv($fp, $fields);
}
fclose($fp);
if ($mysqli->query("
  LOAD DATA INFILE '$tempname' INTO TABLE `Table`
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
  ") === false) {
    error_log($mysqli->error);
}
unlink($tempname);

Read https://dev.mysql.com/doc/refman/5.7/en/load-data.html to make sure you understand about the LOCAL option and the local_infile and secure_file_priv config options.
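If the PHP process and the MySQL server run on different hosts, or secure_file_priv prevents the server from reading your temp directory, the LOCAL variant reads the file from the client side instead. A minimal sketch, with placeholder credentials, untested against a live server; note that the option must be set before connecting:

```php
// Hedged sketch of the LOCAL variant. Credentials are placeholders, and
// local_infile must be enabled on the server or the query will fail.
$mysqli = mysqli_init();
$mysqli->options(MYSQLI_OPT_LOCAL_INFILE, true); // set before real_connect()
$mysqli->real_connect("localhost", "user", "pass", "db");

if ($mysqli->query("
  LOAD DATA LOCAL INFILE '$tempname' INTO TABLE `Table`
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
  ") === false) {
    error_log($mysqli->error);
}
```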

Bill Karwin

Rather than making extra copies of the array, you can use simple for loops to work through it in segments:

$arraySize = count($Data);
for ( $i = 0; $i < $arraySize; $i+=1000 )    {
  // Number of elements in this chunk
  $thisChunk = min(1000, $arraySize-$i);
  // Prepare your statement
  $sql = $mysqli->prepare("INSERT INTO `Table` (Col1, Col2, Col3, Col4) VALUES ".implode(', ', array_fill(0, $thisChunk, "(?, ?, ?, ?)")));
  for( $j = $i; $j < $i + $thisChunk; $j++ )  {
    // Bind this data from $Data[$j]
  }
}

This will, of course, re-prepare your statement for each chunk. You could avoid that by preparing the 1000-row statement once and reusing it for every full chunk, preparing a smaller statement only for the final partial chunk.
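The chunking above can also be sketched with array_chunk(), which does the segment arithmetic for you. This is a sketch, not the answer's exact code: the helper name is illustrative, and the database calls are commented out because they need a live $mysqli connection. On PHP 5.6+ the `...` argument-unpacking operator replaces the call_user_func_array() dance for binding:

```php
// Build the per-chunk INSERT statement; placeholder count follows the chunk size.
function chunkInsertSql(array $rows) {
    $placeholders = implode(', ', array_fill(0, count($rows), '(?, ?, ?, ?)'));
    return "INSERT INTO `Table` (Col1, Col2, Col3, Col4) VALUES $placeholders";
}

$Data = array(array(1, 2, 3, 4), array(1, 2, 3, 5)); // sample rows
foreach (array_chunk($Data, 1000) as $chunk) {
    $sql    = chunkInsertSql($chunk);
    $params = array_merge(...$chunk);          // flatten rows into one value list
    $types  = str_repeat('i', count($params)); // one 'i' per bound value
    // $stmt = $mysqli->prepare($sql);
    // $stmt->bind_param($types, ...$params);  // PHP 5.6+ argument unpacking
    // $stmt->execute();
}
```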

Nigel Ren
  • 56,122
  • 11
  • 43
  • 55
  • I have one problem. The binding returns `Number of variables doesn't match number of parameters in prepared statement`. This because in the `$Data[$j]` loop I bind 4 parameters at a time. How should I fix this issue? E.g. could you show the code how you expect me to bind the `$Data[$j]` params? – TVA van Hesteren May 01 '17 at 18:31
  • 1
    You would have to build up an array of values and use `call_user_func_array(array($stmt, 'bindparams'), $array_of_params);` to link them all to the statement. An alternative would be to move to PDO where you could link them one at a time. – Nigel Ren May 01 '17 at 19:45
  • Thanks PDO was the way to go! I am now able to bind params in a loop and execute the statement when finished. Thanks! – TVA van Hesteren May 01 '17 at 20:34
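For reference, the PDO route mentioned in the comments can be sketched as follows (DSN and credentials are placeholders, untested against a live server). PDOStatement::execute() accepts the flattened value array directly, so no type string or reference juggling is needed:

```php
// Hedged PDO sketch: placeholder DSN and credentials.
$pdo = new PDO("mysql:host=localhost;dbname=test", "user", "pass");
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$Data = array(array(1, 2, 3, 4), array(1, 2, 3, 5)); // sample rows
foreach (array_chunk($Data, 1000) as $chunk) {
    $placeholders = implode(', ', array_fill(0, count($chunk), '(?, ?, ?, ?)'));
    $stmt = $pdo->prepare("INSERT INTO `Table` (Col1, Col2, Col3, Col4) VALUES $placeholders");
    $stmt->execute(array_merge(...$chunk)); // one flat list of values per chunk
}
```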