
I am retrieving 10K records (10 columns, with datatypes datetime, text, int, and varchar) using mysql_fetch_object, fetching all columns from each joined table in the query. This gave me the memory size exhaustion fatal error, but when I amended the query to get only some specific columns, the error went away.

The query simply joins 3 tables, A, B, C: SELECT * FROM A JOIN B ON ...... JOIN C ON ...... WHERE .....

But I am still afraid that if the number of records or number of columns increases, the error will appear again. So what would be a complete solution to this? I read other posts on SO and found that mysql_unbuffered_query should be used. But is there no solution if I use mysql_fetch_object? Also, will an unbuffered query take more time, since it returns one row at a time?
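For reference, mysql_fetch_object also works on the result of mysql_unbuffered_query, so switching does not mean giving up that fetch function. A minimal sketch, where the query columns and handle_record() are placeholders:

// Unbuffered: rows stream from the server as they are fetched, so PHP
// never buffers the full 10K-row result set in memory at once.
$result = mysql_unbuffered_query("SELECT a.id, b.name FROM A a JOIN B b ON a.id = b.a_id");
while ($row = mysql_fetch_object($result)) {
    handle_record($row);   // process and discard each row immediately
}
// Caveat: all rows must be fetched before sending another query on this connection.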

sqlchild
  • can you give us the **actual** code? if you want to optimize it, we need something to optimize. btw, this might be better on codereview.stackexchange.com. – PlantTheIdea Jan 07 '14 at 14:53
  • show the actual query. You could very well be producing a cartesian join, so your 10k records balloon into billions or trillions of rows. – Marc B Jan 07 '14 at 14:53
  • no i executed the query in mysql and it literally returned 10000 rows – sqlchild Jan 07 '14 at 14:54
  • Do you have any LONGBLOB or LONGTEXT columns in your tables? PHP will try to pre-allocate enough memory to hold the largest possible string from those columns, not the length of the actual data. Which means it could be trying to allocate 4GB buffers, much in excess of PHP's default memory limit. Solution: use MEDIUMBLOB/MEDIUMTEXT instead. – Bill Karwin Jan 07 '14 at 14:54
  • @BillKarwin : am using the TEXT datatype, and the user can input any length input for that field so i can't limit the size of that – sqlchild Jan 07 '14 at 14:56
  • Are you sure that you need `*` on your query? – Ilia Ross Jan 07 '14 at 14:59
  • @sqlchild, TEXT cannot be longer than 64KB. See http://dev.mysql.com/doc/refman/5.6/en/storage-requirements.html – Bill Karwin Jan 07 '14 at 14:59
  • @IliaRostovtsev : yes am sure about that – sqlchild Jan 07 '14 at 15:18
  • @BillKarwin : but if 10K rows are fetched, does this overuse the memory – sqlchild Jan 07 '14 at 15:19
  • @sqlchild, yes, if you are expecting large result sets, you should take care not to try to store it all in memory at the same time. That's the reason to use unbuffered queries so the client doesn't prefetch all the rows. Also, design your application code to process the rows one at a time instead of saving them in one big array in PHP space. – Bill Karwin Jan 07 '14 at 17:02

1 Answer


MySQL is not the one running out of memory here; it's your PHP script, so reducing the size of the SELECT is only a partial solution.
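A quick way to confirm that, using PHP's built-in memory functions:

echo ini_get('memory_limit') . "\n";            // the configured PHP limit, e.g. "128M"
echo memory_get_peak_usage(true) . " bytes\n";  // peak memory the script actually allocated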

My guess is that you're doing something like:

$collect = array();
while ($row = mysql_fetch_object($result)) {
    $collect[] = $row;       // every row is accumulated in memory...
}
foreach ($collect as $row) {
    handle_record($row);     // ...before a single one is handled
}

Of course $collect will get very big. See if you can avoid collecting all records: read a record, handle it, free the memory, and move on to the next iteration.

while ($row = mysql_fetch_object($result)) {
    handle_record($row);     // handle each row as it arrives; memory stays flat
}

Iterator sample code:

class DB {
    public function get_iterator($query) {
        $query_result = mysql_query($query);
        return new MyIterator($query_result);
    }
}

class MyIterator {
    private $res;
    private $pos = 0;
    private $count;

    public function __construct($mysql_result) {
        $this->res = $mysql_result;
        $this->count = mysql_num_rows($mysql_result);
    }
    public function hasMore() {
        return $this->pos < $this->count;   // any rows left to read?
    }
    public function fetch() {
        $this->pos++;
        return mysql_fetch_object($this->res);
    }
}

echo '<table>';
$db = new DB();
$it = $db->get_iterator("SELECT * FROM `items`");
while ($it->hasMore()) {
    echo '<tr>';
    $row = $it->fetch();
    foreach ($row as $cell) {
        echo '<td>' . htmlspecialchars($cell) . '</td>';
    }
    echo '</tr>';
}
echo '</table>';

You can even implement PHP's built-in Iterator interface so you can foreach over $it directly, which is awesome.
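For example, a minimal sketch of that, assuming the same buffered mysql_* API as above (mysql_data_seek() only works on buffered results, hence the guard in rewind()):

// Implements PHP's built-in Iterator interface, so callers can
// simply foreach over the result set; only one row is live at a time.
class ResultIterator implements Iterator {
    private $res;
    private $pos = 0;
    private $row = false;

    public function __construct($mysql_result) {
        $this->res = $mysql_result;
    }
    public function rewind() {
        if (mysql_num_rows($this->res) > 0) {
            mysql_data_seek($this->res, 0);   // jump back to the first row
        }
        $this->pos = 0;
        $this->row = mysql_fetch_object($this->res);
    }
    public function valid()   { return $this->row !== false; }
    public function current() { return $this->row; }
    public function key()     { return $this->pos; }
    public function next()    { $this->pos++; $this->row = mysql_fetch_object($this->res); }
}

// Usage: no intermediate array; works with any code expecting a traversable.
foreach (new ResultIterator(mysql_query("SELECT * FROM `items`")) as $row) {
    handle_record($row);
}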

Halcyon
  • yes you are absolutely right, am using the method you posted – sqlchild Jan 07 '14 at 14:58
  • Ok, does this help you get to a solution? – Halcyon Jan 07 '14 at 15:01
  • This is actually a good idea. You could also split the queries to do them in batches of say 1k each (see the sketch after these comments). – Anyone Jan 07 '14 at 15:01
  • @FritsvanCampen : am displaying the records in a table layout after fetching them into a single array, and the while loop is in the back end , acc. to your answer i have to add the UI part in the back end which I can't do – sqlchild Jan 07 '14 at 15:15
  • 1
    Yes you can, you're just not thinking in the right way. Rather than `$db->getMeData("rawr")` and it gives you the data (in an array) you can have it return an Iterator. I added some sample code. – Halcyon Jan 07 '14 at 15:17
  • @FritsvanCampen : sir, can you please post the front-end code too with a simple query : SELECT * FROM items , sorry for being a newbie – sqlchild Jan 07 '14 at 15:29
  • @FritsvanCampen : ok, so first i need to do this : $it = new DB(); then $it = new MyIterator() – sqlchild Jan 07 '14 at 15:37
  • 1
    If you're unfamiliar with classes this solution might be a little bit too advanced. Go read this :) http://php.net/manual/en/language.oop5.basic.php – Halcyon Jan 07 '14 at 15:39
  • sir, let us [continue this discussion in chat](http://chat.stackoverflow.com/rooms/44718/discussion-between-sqlchild-and-frits-van-campen) – sqlchild Jan 07 '14 at 15:45
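As for the batching idea raised in the comments, a minimal sketch, assuming a plain `items` table; the table, the columns, and handle_record() are placeholders:

// Pull rows in batches of 1,000 so neither MySQL's client buffer nor a
// PHP array ever holds the full result set at once.
$batchSize = 1000;
$offset = 0;
do {
    $result = mysql_query("SELECT id, name FROM items LIMIT $offset, $batchSize");
    $fetched = 0;
    while ($row = mysql_fetch_object($result)) {
        handle_record($row);
        $fetched++;
    }
    mysql_free_result($result);   // release the client buffer between batches
    $offset += $batchSize;
} while ($fetched === $batchSize);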