
I have an appointment script in PHP + MySQL.

In the script I have 6 columns and every column has 64 rows, which equals 384 fields. For every field I run one mysql_query call, which makes 384 queries to MySQL. Is there any way to make this script more effective and faster? I also want to add that this script runs on a Xeon 3.4 GHz 4-core server with 8 GB RAM, and this page takes about 20-25 seconds to load, while other pages where I don't use a loop return results very fast even if I list a few hundred rows.

This is my code:

$listebolum = mysql_query("SELECT * FROM bolum WHERE randevu='ok' AND sube='".$_SESSION['ksube']."' ORDER BY id ASC");
while ($listeboluml = mysql_fetch_array($listebolum)) {
    $basla = $danas;
    echo "<div style=\"position:relative; width:".$sirina."%; float:left; border-right:solid 1px #9a4b9d;\">";
    for ($ra = $danas; $ra < $danasson; $ra = ($ra + 900)) {

        $uzmirandevu = mysql_query("SELECT randevu.id AS rid, randevu.bitis AS rbitis, randevu.baslama AS rbaslama, randevu.notum AS rnotum, randevu.hizmetler AS rhizmetler, musteri.ad AS mad, musteri.soyad AS msoyad FROM randevu LEFT JOIN musteri ON randevu.musteri=musteri.id WHERE randevu.baslama='".$basla."' AND randevu.sube='".$_SESSION['ksube']."' AND randevu.bolum='".$listeboluml['id']."'");
        $uzmirandevul = mysql_fetch_array($uzmirandevu);
        $yukseklik = ((($uzmirandevul['rbitis'] - $uzmirandevul['rbaslama']) / 900) * 26) - 1;
        echo "some data from database";
        $basla = $uzmirandevul['rbitis'];
    }
    echo "</div>";
}

MySQL structure:

CREATE TABLE `randevu` (
`id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`baslama` int(10) unsigned NOT NULL,
`bitis` int(10) unsigned NOT NULL,
`musteri` int(10) unsigned NOT NULL,
`personel` smallint(5) unsigned NOT NULL,
`notum` varchar(512) COLLATE utf8_unicode_ci NOT NULL,
`durum` char(2) COLLATE utf8_unicode_ci NOT NULL,
`sube` smallint(4) unsigned NOT NULL,
`bolum` smallint(4) unsigned NOT NULL,
`hizmetler` varchar(256) COLLATE utf8_unicode_ci NOT NULL,
`zaman` int(10) unsigned NOT NULL,
`rgun` tinyint(2) NOT NULL,
`ray` tinyint(2) NOT NULL,
`ryil` smallint(4) NOT NULL,
`guncelleme` int(10) NOT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci AUTO_INCREMENT=1 ;
  • Can you show us what your database structure looks like? – sulmanpucit Dec 09 '13 at 15:35
  • @sulmanpucit of course – mandza Dec 09 '13 at 15:36
  • `$yukseklik = ($uzmirandevul['rbitis']-$uzmirandevul['rbaslama'])/900*26-1;` – webbiedave Dec 09 '13 at 15:36
  • Depending on how many results you're looping through, you could be running a lot of additional queries. Can't you use `JOIN` to pull the additional data? – Ryan Dec 09 '13 at 15:37
  • You shouldn't be using mysql_fetch_array and mysql_query as these are deprecated. You should use MySQLi or PDO. See here: http://www.php.net/manual/en/function.mysql-db-query.php – Tom Dec 09 '13 at 15:39
  • @Stanyer as I said, I have 6 columns and in every column this code is run 64 times – mandza Dec 09 '13 at 15:44
  • PDO is just an API - it does not increase performance. – tyteen4a03 Dec 09 '13 at 15:44
  • You should look at using a JOIN. You'll be able to pull the additional data out with that. http://dev.mysql.com/doc/refman/5.0/en/join.html – Ryan Dec 09 '13 at 15:45
  • @TimothyChoi is there any other approach I can use instead of making a request for every row separately? – mandza Dec 09 '13 at 15:48
  • Why are you doing the same query in the inner loop, when the parameters don't change? What is `$ra` used for? – Barmar Dec 09 '13 at 16:25
  • @Barmar the parameters do change: `$basla=$uzmirandevul[rbitis];` is the line that changes which data is requested on the next iteration. – mandza Dec 09 '13 at 16:29

1 Answer


Because your code isn't written in English, it is a bit hard for me to understand, but I advise you to read a bit about the N+1 problem. You can probably remove the second query from the loop in order to cut down your SQL queries. This is called eager loading.

Here is an example:

SELECT * FROM authors // first query

foreach ($authors as $author) {
    SELECT * FROM books WHERE author_id = $author->id // second query
}

With 10 authors, this runs just fine. But with 100 authors, you are running 101 queries: 1 to get the list of authors, and then 1 for every author. In other words, more authors means more queries and therefore a longer loading time.

You can speed things up by removing the second query from the foreach-loop:

SELECT * FROM authors // first query

$author_ids = array();
foreach ($authors as $author) {
    $author_ids[] = $author->id; // add author id to array
}

$author_ids = implode(',', $author_ids); // create a list of comma-separated ids

SELECT * FROM books WHERE author_id IN ($author_ids) // second query

foreach ($authors as $author) {
    // link books to authors
}

This way you use one query to retrieve all the books, and then you use PHP to link books to the right authors.

It is a bit more complicated, but it reduces the number of SQL queries to 2. This is obviously still very simplistic, but it should give you an idea of the concept.
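To make the linking step concrete, here is a minimal PHP sketch of it, using PDO (as the comments above suggest, rather than the deprecated mysql_* functions). The connection details are placeholders, and the `authors`/`books`/`author_id` names are just the ones from the example:

<?php
// Placeholder connection details - adjust to your own database.
$pdo = new PDO('mysql:host=localhost;dbname=test;charset=utf8', 'user', 'pass');

// First query: all authors.
$authors = $pdo->query("SELECT * FROM authors")->fetchAll(PDO::FETCH_ASSOC);

// Collect the author ids (cast to int so they are safe to put in the IN () list).
$author_ids = array_map(function ($author) { return (int) $author['id']; }, $authors);

// Second query: all books for all of those authors at once.
$books = array();
if ($author_ids) {
    $in = implode(',', $author_ids);
    $books = $pdo->query("SELECT * FROM books WHERE author_id IN ($in)")->fetchAll(PDO::FETCH_ASSOC);
}

// Group the books by author_id so the lookup inside the loop is an array access,
// not another query.
$books_by_author = array();
foreach ($books as $book) {
    $books_by_author[$book['author_id']][] = $book;
}

foreach ($authors as $author) {
    $author_books = isset($books_by_author[$author['id']]) ? $books_by_author[$author['id']] : array();
    // ... output $author and $author_books here ...
}

However many authors there are, this always runs exactly two queries.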

  • I've added an example. It's not at all ready for use but it should give you an idea of the concept. – Nic Wortel Dec 09 '13 at 16:04
  • By the way, this solution is called 'eager loading'. – Nic Wortel Dec 09 '13 at 16:21
  • For further readers this is a useful link on what 'eager loading' is: http://stackoverflow.com/questions/1299374/what-is-eager-loading – mandza Dec 09 '13 at 16:25
  • How would you apply this answer to his case? As he explained in his reply to my comment, each iteration of the inner query is dependent on a value returned by the previous query. He isn't just getting a row for each value returned by the first query. – Barmar Dec 09 '13 at 16:41
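
Applied to the code in the question, the same idea could look roughly like the sketch below. It is only a sketch: it assumes `$danas`, `$danasson`, `$sirina` and `$_SESSION['ksube']` are set exactly as in the original script, keeps the deprecated mysql_* functions purely to stay close to that script (MySQLi or PDO would be better, as noted in the comments), assumes every chained appointment starts inside the `$danas`-`$danasson` window, and guesses that an empty slot should simply advance by one 900-second step:

// One query for the whole grid instead of one query per cell.
$sube = mysql_real_escape_string($_SESSION['ksube']);
$res = mysql_query("SELECT randevu.bolum AS rbolum, randevu.id AS rid, randevu.bitis AS rbitis, randevu.baslama AS rbaslama, randevu.notum AS rnotum, randevu.hizmetler AS rhizmetler, musteri.ad AS mad, musteri.soyad AS msoyad FROM randevu LEFT JOIN musteri ON randevu.musteri=musteri.id WHERE randevu.sube='".$sube."' AND randevu.baslama >= ".(int)$danas." AND randevu.baslama < ".(int)$danasson);

// Index the rows by (bolum, baslama) so the inner loop becomes an array lookup.
$randevular = array();
while ($row = mysql_fetch_assoc($res)) {
    $randevular[$row['rbolum']][$row['rbaslama']] = $row;
}

$listebolum = mysql_query("SELECT * FROM bolum WHERE randevu='ok' AND sube='".$sube."' ORDER BY id ASC");
while ($listeboluml = mysql_fetch_assoc($listebolum)) {
    $basla = $danas;
    echo "<div style=\"position:relative; width:".$sirina."%; float:left; border-right:solid 1px #9a4b9d;\">";
    for ($ra = $danas; $ra < $danasson; $ra = ($ra + 900)) {
        if (isset($randevular[$listeboluml['id']][$basla])) {
            $uzmirandevul = $randevular[$listeboluml['id']][$basla];
            $yukseklik = ((($uzmirandevul['rbitis'] - $uzmirandevul['rbaslama']) / 900) * 26) - 1;
            echo "some data from database";
            $basla = $uzmirandevul['rbitis']; // follow the chain, as in the original code
        } else {
            $basla = $basla + 900; // assumption: an empty slot advances by one 900-second step
        }
    }
    echo "</div>";
}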