
I have two versions of code to do the same thing. One uses prepared statements, the other concatenated strings. My understanding is that the prepared statements should result in a performance increase, but after setting $size=100 (therefore iterating the code for 10000 insert queries), I haven't been able to detect any difference in performance between the two methods. Each runs in approx. 133 seconds. Am I implementing the prepared statement code incorrectly? Code follows:

Prepared Statements:

if ($stmt = $mysqli->prepare("INSERT INTO sectors (sector) VALUES (?)"))
{
    for ($x = 0; $x < $size; ++$x)
    {
        for ($y = 0; $y < $size; ++$y)
        {
            $dirtysector = $x . "-" . $y;
            $secstring = clean_sector($dirtysector);
            $stmt->bind_param('s', $secstring);
            $stmt->execute();
        }
    }
    $stmt->close();
}

Concatenated Strings:

for ($x = 0; $x < $size; ++$x)
{
    for ($y = 0; $y < $size; ++$y)
    {
        $dirtysector = $x . "-" . $y;
        $secstring = clean_sector($dirtysector);
        $query = "INSERT INTO sectors " .
                 "(sector) VALUES " .
                 "('$secstring')";
        $result = $mysqli->query($query);
        if (!$result) die("Sector Creation Failed: " . $mysqli->error);
    }
}
Michael.Lumley
    How are you measuring those *performances* with a tiny 1000 iterations simple loop? And besides this, using prepared statements in PHP is usually a matter of security, not performance. – Eggplant Sep 06 '13 at 18:14
    So you're saying that 1000 iterations isn't enough to achieve a detectable difference? Each version clocks in at about 133 seconds. I assumed that should be long enough to see some performance increase, but maybe I am wrong about that. – Michael.Lumley Sep 06 '13 at 18:16
  • So you are basically measuring the performance of that `clean_sector()` function this way? I'm no benchmark expert, but that code doesn't look like the best to measure things. Use some epoch/microtime instead. – Eggplant Sep 06 '13 at 18:19
  • No. The clean_sector() function is irrelevant here, it just cleans up the string that's going into the mysql. For my test, I set the variable $size=100, therefore the code runs 1000 INSERT queries, and I'm interested in the performance of that query. – Michael.Lumley Sep 06 '13 at 18:25
  • If it's performance what you are really looking for, then you should wrap everything in a single transaction, and then you will definitely notice some changes. – Eggplant Sep 06 '13 at 18:33
  • So do you mean that the execute statement should be run outside the for loop, after all the variables are bound? I'm not sure what you mean by "wrap everything in a single transaction." – Michael.Lumley Sep 06 '13 at 18:37
  • Never mind. After doing some more research, I assume you are referring to this: http://stackoverflow.com/questions/1793169/which-is-faster-multiple-single-inserts-or-one-multiple-row-insert – Michael.Lumley Sep 06 '13 at 18:41
  • Just add `$mysqli->beginTransaction();` at the beginning of your code (the PDO version) and `$mysqli->commit();` at the end of it. This will boost your `INSERT` statements. After your edit: that is a way to boost performance, but having many `INSERTs` in a single transaction is *almost* as good. – Eggplant Sep 06 '13 at 18:42

3 Answers


I would use some sort of time functionality to determine the exact speed of your transactions.

Such as:

$time = -microtime(true);
// run transactions
$time += microtime(true);
echo $time . ' in seconds to process...';
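That pattern can be wrapped into a small reusable helper. A minimal sketch follows; `time_block` is a hypothetical helper name, and the dummy loop merely stands in for the database work you would actually measure:

```php
<?php
// Minimal timing helper: returns wall-clock seconds taken by $fn.
function time_block(callable $fn): float
{
    $start = microtime(true);       // high-resolution timestamp in seconds
    $fn();
    return microtime(true) - $start;
}

// Example: time a stand-in for the insert loop.
$elapsed = time_block(function () {
    for ($i = 0; $i < 100000; ++$i) {
        $s = $i . "-" . ($i % 100); // placeholder for per-row work
    }
});
printf("%.4f seconds to process\n", $elapsed);
```

Timing the loop alone (rather than the whole script) isolates the database interaction from setup cost.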

Also, prepared statements are not necessarily created to increase performance. If anything, I would assume they would slow things down. Prepared statements exist to 'prepare' a statement: the server parses the SQL text, checks it for syntax errors, and binds the parameter values separately, which protects against SQL injection. That preparation overhead will not, by itself, provide a performance boost.

Rottingham
    My understanding is that prepared statements result in better performance because the query does not need to be reevaluated each time it is called. http://mattbango.com/notebook/code/prepared-statements-in-php-and-mysqli/ – Michael.Lumley Sep 06 '13 at 18:23
  • @user2102695 your understanding is correct, but not your testing approach – samayo Sep 06 '13 at 18:25
  • Also, this is exactly what I did. When $size = 100 (therefore there are 1000 INSERT queries) the script takes about 133 seconds to run under either approach. If what I read is true, the prepared statements should be substantially faster. I don't need to worry about SQL injection here, because there is no user input in this script. – Michael.Lumley Sep 06 '13 at 18:27
  • @Simon_eQ - what is wrong with the approach? I'm measuring the time the script takes to run in a separate function. – Michael.Lumley Sep 06 '13 at 18:28
  • In `INSERT` statements the performance benefits of prepared statements are not really related to the cached execution plan, but rather to *less data travelling to the server each time* since only the parameter values are passed. This becomes even less noticeable if the database is located on the same server where the script is executed. A `SELECT` query would certainly give you more clear results, but again, you should isolate the database interaction to properly test the performance. – Eggplant Sep 06 '13 at 18:31
  • If the bottleneck is your database, the query overhead might not be significant enough to matter. You should use placeholders not because they're faster, but because they're easier to create and be assured there's no SQL injection bugs. – tadman Sep 06 '13 at 19:37

Because it's a myth.

As a matter of fact, such a "performance increase" is just a fairy tale intended to lure users into the hands of prepared statements.

However

133 seconds is way too much time for 10000 inserts. That's nearly 10000 times slower than it should be.

It's the InnoDB engine with its default settings that slows your inserts down. You may change the settings or, as you have been told, wrap your inserts in a single transaction. You will get all your inserts done in a fraction of a second.
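With mysqli (as in the question), that looks roughly like the sketch below. The table and column names come from the question; everything else is an assumption, and this is not the exact code anyone in the thread ran:

```php
<?php
// Sketch: wrap all inserts in one InnoDB transaction (mysqli API).
// begin_transaction() needs PHP 5.5+; $mysqli->autocommit(false) works earlier.
$mysqli->begin_transaction();

$stmt = $mysqli->prepare("INSERT INTO sectors (sector) VALUES (?)");
for ($x = 0; $x < $size; ++$x) {
    for ($y = 0; $y < $size; ++$y) {
        $secstring = clean_sector($x . "-" . $y);
        $stmt->bind_param('s', $secstring);
        $stmt->execute();       // no per-row flush inside the transaction
    }
}
$stmt->close();

$mysqli->commit();              // one durable write for all 10000 rows
```

Without the transaction, InnoDB's default `innodb_flush_log_at_trx_commit=1` forces a log flush per statement, which is what dominates the 133 seconds.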

Your Common Sense

Preparing the statements in advance did not result in a large increase in performance because the performance gains result primarily from reduced network travel time. As my database is stored on the same machine on which my script is executed, there was little performance benefit to the prepared statements.

However, when inserting multiple values with a single query, I saw more than a 100X increase in performance. This is a much better solution to my performance problems.

See: http://www.electrictoolbox.com/mysql-insert-multiple-records/
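A sketch of the multi-row form, using the question's table and column names; `build_multi_insert` is a hypothetical helper, and the escape callback is injected so the string-building part can be exercised without a live database:

```php
<?php
// Sketch: build one multi-row INSERT instead of 10000 single-row ones.
function build_multi_insert(array $values, callable $escape): string
{
    $rows = array_map(
        function ($v) use ($escape) { return "('" . $escape($v) . "')"; },
        $values
    );
    return "INSERT INTO sectors (sector) VALUES " . implode(", ", $rows);
}

// Usage with a live connection (escape each value via mysqli):
// $sql = build_multi_insert($sectors, [$mysqli, 'real_escape_string']);
// $mysqli->query($sql);
```

For very large batches, split the rows into chunks so the statement stays under MySQL's `max_allowed_packet` limit.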

Michael.Lumley
  • Network travel has nothing to do with prepared statements. You can have 100X with **any** method if you use your database properly. An answer consisting of a false statement and a hyperlink cannot be considered a good one. – Your Common Sense Sep 08 '13 at 12:09
  • I was simply reposting Eggplant's comment as an answer. – Michael.Lumley Sep 08 '13 at 18:49