
I am testing my PHP code to see which coding methods execute fastest. Eventually I want to test all of my PHP code that needs it, so I can write the most rapidly-executed code that I can. I am starting with a simple example on my home page: using microtime() to record the time before an INSERT statement executes and the time after it executes, then echoing the difference as follows:

//Lots of code ($conn is the mysqli connection created earlier)
$microtime1 = microtime(true); // pass true to get a float; without it, microtime() returns a "msec sec" string
$sql1 = "INSERT INTO Table (value1,value2,value3,value4,value5,value6,value7) VALUES
('$value1', '$value2', '$value3', '$value4', '$value5', '$value6', '$value7')";

if (mysqli_query($conn, $sql1)) {
    $microtime2 = microtime(true);
    $Difference = $microtime2 - $microtime1;
    echo "<SCRIPT>
        alert('$Difference');
        location = 'home.php';
        </SCRIPT>";
} else {
    $message = 'The site is having some technical difficulties. Please try again!';
    echo "<SCRIPT>
        alert('$message');
        location = 'home.php';
        </SCRIPT>";
}
//More code

Over 10 trials, I initiated this query by entering the same 3-letter string ('ddd') into the same input text box and clicking the same button (with no other users logged in besides me), recording $Difference each time. I am astonished at how much variance there is in the data: over these 10 trials, the standard deviation of $Difference was 40% of its mean. Is there a better way to do this that I am not aware of? Or will I have to do 10-20 trials for each function to get a usable mean value of $Difference for the rest of my code?

  • What are you *actually* trying to accomplish? The best way to reduce query times is by reducing the number of queries, optimizing tables with indexes and optimizing queries to use those indexes. That the queries vary could be due to many things, but I doubt a single insert is anything to worry about. If you are performing thousands of inserts, you would do better to combine them into one query. Also, using prepared statements is always advisable, especially when repeating a similar query multiple times with different values. – Mike Dec 07 '13 at 01:30
  • To answer your question, microtime is accurate. It is much more likely that the query itself is varying 40% from the mean. – Mike Dec 07 '13 at 01:34
  • In addition to reducing the number of queries and optimizing them, I am also trying to write the most rapidly-executing code. Consider it curiosity, since I was formerly an experimental chemist :) I am relatively new to coding, but I have a hard time believing that all algorithms are created equal. For example, to check whether a string is a member of an array, I can use the function in_array, or I can use a foreach loop coupled with a comparison operator, comparing each member of the array to the string to see if there is an exact match. I would like a number to tell me which method is faster! – The One and Only ChemistryBlob Dec 07 '13 at 01:37
  • OK....that's what I think too. The variation must be from the query. – The One and Only ChemistryBlob Dec 07 '13 at 01:38
  • Of course in most cases it's best to use the fastest techniques possible, but when you micro-optimize your code (DB queries excluded) you are dealing with millionths of a second in the vast majority of cases. In other words, it's not worth it. Whether you use single or double quotes for strings, or strpos versus some sort of regex, really doesn't matter. Use the best tool for the job for readability, maintainability and functionality. Anything else is really just a waste of time. – Mike Dec 07 '13 at 01:44
  • Of course, if you are just doing it for curiosity's sake, knock yourself out! – Mike Dec 07 '13 at 01:44
  • Please learn to indent your code. – AStopher Nov 25 '14 at 15:54
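To put a number on the in_array-versus-foreach question raised in the comments, a micro-benchmark can loop each approach many times and report the mean time per call, which also smooths out the run-to-run variance the question describes. This is a minimal sketch, not a rigorous benchmark; the haystack, needle and iteration count are arbitrary choices for illustration:

```php
<?php
// Manual search: compare each element to the needle, as described above.
function foreachSearch(array $haystack, string $needle): bool
{
    foreach ($haystack as $item) {
        if ($item === $needle) {
            return true;
        }
    }
    return false;
}

// Run $fn many times and return the mean seconds per call.
function benchmark(callable $fn, int $iterations = 100000): float
{
    $start = microtime(true);
    for ($i = 0; $i < $iterations; $i++) {
        $fn();
    }
    return (microtime(true) - $start) / $iterations;
}

$haystack = range('a', 'z');

$inArrayTime = benchmark(fn() => in_array('z', $haystack, true));
$foreachTime = benchmark(fn() => foreachSearch($haystack, 'z'));

printf("in_array: %.3e s/call\nforeach:  %.3e s/call\n", $inArrayTime, $foreachTime);
```

Averaging over many iterations inside one process is what makes the comparison usable; a single call is dominated by the same noise the question ran into.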

1 Answer


What you are talking about is profiling. Although it is possible to "roll your own" (and you are on the right track in that regard), there are a lot of helpful tools out there that will do the heavy lifting for you.

If you want, you can always do what you suggested and smooth out the deviations by looping through multiple iterations of your tests. But if you want comprehensive profiling of all of your PHP, I recommend the "xdebug" answer to this question: Simplest way to profile a PHP script

It worked well for me.
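If you do roll your own harness, it helps to report the spread as well as the mean. A minimal sketch of that idea, with md5() of a string standing in for the INSERT query (timing the real query would need the database connection from the original code):

```php
<?php
// Run the operation under test $trials times and return the mean and
// standard deviation of the wall-clock times, rather than trusting a
// single measurement.
function timeTrials(callable $fn, int $trials): array
{
    $times = [];
    for ($i = 0; $i < $trials; $i++) {
        $start = microtime(true);
        $fn();
        $times[] = microtime(true) - $start;
    }

    $mean = array_sum($times) / $trials;
    $variance = 0.0;
    foreach ($times as $t) {
        $variance += ($t - $mean) ** 2;
    }

    return ['mean' => $mean, 'stddev' => sqrt($variance / $trials)];
}

// Stand-in workload; swap in the mysqli_query() call to time the INSERT.
$stats = timeTrials(fn() => md5(str_repeat('ddd', 1000)), 50);

printf("mean: %.3e s, stddev: %.3e s (%.0f%% of mean)\n",
    $stats['mean'], $stats['stddev'], 100 * $stats['stddev'] / $stats['mean']);
```

Reporting the standard deviation alongside the mean makes it obvious when two techniques are too close to distinguish from the noise.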
