7

I'm having some serious problems with the PHP Data Object functions. I'm trying to loop through a sizeable result set (~60k rows, ~1gig) using a buffered query to avoid fetching the whole set.

No matter what I do, the script just hangs on the PDO::query() - it seems the query is running unbuffered (why else would the change in result set size 'fix' the issue?). Here is my code to reproduce the problem:

<?php
$Database = new PDO(
    'mysql:host=localhost;port=3306;dbname=mydatabase',
    'root',
    '',
    array(
        PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
        PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => true
    )
);

$rQuery = $Database->query('SELECT id FROM mytable');

// This is never reached because the result set is too large
echo 'Made it through.';

foreach($rQuery as $aRow) {
    print_r($aRow);
}
?>

If I limit the query with some reasonable number, it works fine:

$rQuery = $Database->query('SELECT id FROM mytable LIMIT 10');

I have tried playing with PDO::MYSQL_ATTR_MAX_BUFFER_SIZE and using PDO::prepare() with PDOStatement::execute() as well (though there are no parameters in the above query), both to no avail. Any help would be appreciated.

Stewart

3 Answers

9

If I understand this right, buffered queries involve telling PHP that you want to wait for the entire result set before you begin processing. Prior to PDO, this was the default and you had to call mysql_unbuffered_query if you wanted to deal with results immediately.

Why this isn't explained on the PDO MySQL driver page, I don't know.
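To make the distinction concrete: with buffering on (the question's setup), the whole result set is pulled client-side before query() returns; switching the attribute off streams rows as you fetch them. A minimal sketch reusing the question's $Database connection and table (this needs a live MySQL server, so it is illustrative only):

```php
<?php
// Unbuffered mode: the client library hands rows over as they arrive
// instead of pulling the entire result set into client memory first.
$Database->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$rQuery = $Database->query('SELECT id FROM mytable');
while ($aRow = $rQuery->fetch(PDO::FETCH_ASSOC)) {
    print_r($aRow); // memory use stays flat even for ~60k rows
}
$rQuery->closeCursor(); // free the result before running another query
?>
```

The trade-off is that the connection is tied up until the result set is fully read or closed, which is why closeCursor() matters here.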

Powerlord
  • Wow okay I'm an idiot. Don't know what gave me the opposite impression. – Stewart Feb 23 '09 at 19:52
  • Technically, a "buffered" query means the MySQL client library pulls the whole resultset off the TCP stream before handing it back to you. – staticsan Feb 24 '09 at 06:13
  • Hmm, I thought the manual covered the difference between buffered/unbuffered (mysql style) and fetch/fetchAll (PDO style), but looking again it does not. If you want some more background info, you may find the following useful: http://netevil.org/blog/2008/06/slides-pdo – Wez Furlong Aug 22 '09 at 00:41
  • The revised link to those slides is: http://wezfurlong.org/blog/2008/jun/slides-pdo/ – Wez Furlong Dec 14 '11 at 00:58
2

You could try to split it up into chunks that aren't big enough to cause problems:

<?php
$offset = 0;

do {
    $rQuery = $Database->query(
        'SELECT id FROM mytable ORDER BY id ASC LIMIT 100 OFFSET ' . $offset
    );
    $aRows = $rQuery->fetchAll(); // 100 rows at a time stays small in memory
    stuff($aRows);
    $offset += 100;
} while (count($aRows) === 100);  // a short (or empty) chunk means we're done
?>

...you get the idea, anyway.
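One thing to watch with LIMIT/OFFSET paging: PDO::query() returns a PDOStatement object even when the result set is empty, so the reliable exit test is whether a chunk came back shorter than the chunk size. A minimal sketch of that logic against an in-memory array (no database; range(1, 250) stands in for mytable):

```php
<?php
// Simulate paging through 250 ids in chunks of 100, no database needed.
$allIds    = range(1, 250);
$chunkSize = 100;
$offset    = 0;
$seen      = [];

do {
    // Stand-in for: SELECT id ... LIMIT $chunkSize OFFSET $offset
    $chunk = array_slice($allIds, $offset, $chunkSize);
    foreach ($chunk as $id) {
        $seen[] = $id;
    }
    $offset += $chunkSize;
} while (count($chunk) === $chunkSize); // short chunk: we hit the end

// Visits chunks of 100, 100, then 50 - every id exactly once.
?>
```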

  • Note that this approach works only for queries on tables that aren't changing - any insert or delete operation can cause you to miss a record or read it twice! – mindplay.dk Jan 24 '19 at 08:14
-1

Or maybe you could try mysql functions instead:

while ($row = mysql_fetch_row($query)) {
...
}

This will definitely be faster, since that foreach statement gives the impression of using fetchAll() instead of fetch()ing each row.

Josh Darnell
Martin