
I have an array named $prices with 1000 elements, and I used this code to add them to the database:

for ($i = 0; $i < 1000; $i++) {
  $query  = "INSERT INTO data (amount) VALUES";
  $query .= " (" . $prices[$i] . ");"; // one INSERT query per element
  mysqli_query($link, $query);
}

Although it works, it takes more than 15 seconds to finish. What can I do to optimize it?

omidh
  • Use a prepared statement with a bind variable; you only need to prepare the statement once, and then bind and execute inside the loop – Mark Baker Dec 12 '15 at 21:08
  • ^ this. Read more: http://php.net/manual/en/mysqli.prepare.php – JimL Dec 12 '15 at 21:09
  • That can be one query, append the values with a comma: `(" . $prices[$i] . "),`. Then `rtrim` the last `,` (see the sketch after these comments). – chris85 Dec 12 '15 at 21:14
  • Prepare an sql script holding the insert commands and load that instead of processing each entry in an additional language. An sql script is a simple text file holding sql commands. You can use any text editor to write it. – arkascha Dec 12 '15 at 21:16
  • Another point: _drop_ any indexes that exist on the table before inserting the entries, and recreate the indexes afterwards. Otherwise the indexes have to be rearranged after each and every insert. – arkascha Dec 12 '15 at 21:18
  • Insert multiple items at once , ex: http://www.w3schools.com/php/php_mysql_insert_multiple.asp OR http://stackoverflow.com/questions/6889065/inserting-multiple-rows-in-mysql – ka_lin Dec 12 '15 at 21:27
  • @KA_lin: `multi_query` isn't always recommended (I'd even argue it's something best avoided unless you really know what you're doing), and it's really only useful if you're inserting into multiple tables. OP is inserting into a single table here – Elias Van Ootegem Dec 12 '15 at 21:34
  • @arkascha: That makes no sense... if you're not over-/under-indexing, inserting a thousand values into a table shouldn't be a problem. Dropping the index, only to add it after inserting is just silly. A single insert (or even 1000 inserts) should *not* trigger a table rebuild, dropping and then adding an index is far more likely to do that – Elias Van Ootegem Dec 12 '15 at 21:40
  • @EliasVanOotegem Well, just my experience, and it appears to make sense to me. Certainly inserting a thousand values into a table is not a problem. I never said so. Dropping and re-adding the index is, however, standard for database imports. I obviously cannot say anything specific here, since indeed 1000 values is not a big deal. But then again, that completely depends on the type(s) of index definition(s). – arkascha Dec 12 '15 at 21:45
  • @arkascha: An import on an inactive DB (as in a fixture of sorts): yes, adding the indexes later on is pretty standard. Do I care in case of a 1000 rows? Not really. But as you said, and as I hinted at in my last comment: it all depends on the table/storage engine. If you're storing everything as VARCHAR, and 80% of your fields are (part of) an index on InnoDB (or worse: NDB), then God help you... ;-) – Elias Van Ootegem Dec 12 '15 at 21:49
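A minimal sketch of the single-query route chris85 describes above, assuming $prices holds numeric values and $link is the mysqli connection from the question (the float cast stands in for proper escaping of numeric data):

$values = '';
foreach ($prices as $price) {
    $values .= '(' . (float) $price . '),'; // cast guards against injection for numeric values
}
// trim the trailing comma, then run a single INSERT for all 1000 rows
$query = 'INSERT INTO data (amount) VALUES ' . rtrim($values, ',');
mysqli_query($link, $query);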

2 Answers


Technically speaking, you don't even need the loop: you can insert all the values with one single query, passing the whole $prices array to execute():

$pdo = new PDO($dsn, $user, $pass, $attr);
$query = 'INSERT INTO data (amount) VALUES ';
$valueSets = array_fill(0, count($prices), '(?)'); // one (?) placeholder per price
$query .= implode(', ', $valueSets); // concatenate the placeholders onto the query string
$stmt = $pdo->prepare($query);
$stmt->execute($prices);
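Note that with ? placeholders, PDOStatement::execute() expects a zero-indexed array, so if $prices can arrive with non-sequential keys (after filtering, say), pass array_values($prices) instead.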

Of course, building the query string like that isn't a very nice way to do things, so I'd probably use something like this:

$stmt = $pdo->prepare('INSERT INTO data (amount) VALUES (:price)');
foreach ($prices as $price) {
    $stmt->execute([':price' => $price]);
    $stmt->closeCursor(); // optional
}
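If the loop is still slow, the bulk of the time usually goes into one implicit commit per execute(); a sketch of the same loop wrapped in a single transaction (assuming the same $pdo connection and an InnoDB table), so all rows share one commit:

$stmt = $pdo->prepare('INSERT INTO data (amount) VALUES (:price)');
$pdo->beginTransaction(); // one commit for all rows instead of one per insert
foreach ($prices as $price) {
    $stmt->execute([':price' => $price]);
}
$pdo->commit();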

Or, if you really want to use Mysqli:

$stmt = $link->prepare('INSERT INTO data (amount) VALUES (?)');
$price = null;
$stmt->bind_param('i', $price); // bind by reference
foreach ($prices as $price) {
    $stmt->execute(); // insert price
}
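One detail to check: the 'i' type specifier binds an integer, so if the amounts are decimal prices, 'd' (double) is the specifier to use instead.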

But honestly: this looks to me like DB fixture work. I'd simply put the prices in a CSV file and run a LOAD DATA LOCAL INFILE query instead of writing a script to insert all the values.
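For example (the file path and line format here are assumptions about how such a CSV might look, with one amount per line):

LOAD DATA LOCAL INFILE '/path/to/prices.csv'
INTO TABLE data
LINES TERMINATED BY '\n'
(amount);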

Elias Van Ootegem
  • Well, my prices were originally in CSV! At first this option looked good, but my CSV file is a little crazy and I need to run preg_replace() on every cell before inserting them into the database. – omidh Dec 12 '15 at 21:44

Making a database request is a costly task. You should reduce the number of requests. You could, for instance, insert multiple values at once with a single request like so:

INSERT INTO data (amount) VALUES ('v1'), ('v2'), ('v3')

Be aware that your database settings may limit the maximum length of a query (max_allowed_packet in MySQL), so 10 requests of 100 insertions each, or 20 requests of 50 insertions (20×50=1000), may do the job. And it will be a lot faster.
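A sketch of that chunked approach, assuming a PDO connection in $pdo and the 50-rows-per-query split mentioned above:

foreach (array_chunk($prices, 50) as $chunk) {
    $placeholders = implode(', ', array_fill(0, count($chunk), '(?)'));
    $stmt = $pdo->prepare('INSERT INTO data (amount) VALUES ' . $placeholders);
    $stmt->execute(array_values($chunk)); // 20 queries of 50 rows each for 1000 prices
}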

Mat