When I execute the following code against a users table of about 60,000 records:
mysql_connect("localhost", "root", "");
mysql_select_db("test");

$result = mysql_query("select * from users");
while ($row = mysql_fetch_object($result)) {
    // Print the memory allocated to PHP after each row is fetched.
    echo convert(memory_get_usage(true)) . "\n";
}

// Format a byte count as a human-readable size, e.g. "1.5 mb".
function convert($size) {
    $unit = array('b', 'kb', 'mb', 'gb', 'tb', 'pb');
    return @round($size / pow(1024, ($i = floor(log($size, 1024)))), 2) . ' ' . $unit[$i];
}
I get the following error:
PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes)
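(134217728 bytes is exactly 128 MB, so the script is exhausting a 128M memory_limit.)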
Any thoughts on how to keep the script from consuming additional memory with each pass through the loop? In my actual code I'm trying to provide a CSV download for a large dataset, with a little PHP pre-processing; it's roughly the shape of the sketch below.
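For context, here's a simplified sketch of the real use case. The table, the column names, and the ucwords/trim step are placeholders for illustration, not my actual schema or pre-processing:

// Hypothetical sketch: stream a CSV download, pre-processing each row
// a little before writing it out. Column names are made up.
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="users.csv"');

mysql_connect("localhost", "root", "");
mysql_select_db("test");

$out = fopen('php://output', 'w');
fputcsv($out, array('id', 'name', 'email')); // header row (placeholder columns)

$result = mysql_query("select * from users");
while ($row = mysql_fetch_object($result)) {
    // Stand-in for the "little PHP pre-processing" mentioned above.
    $name = ucwords(trim($row->name));
    fputcsv($out, array($row->id, $name, $row->email));
}
fclose($out);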
Please don't recommend increasing PHP's memory limit: it's a bad idea and, more importantly, it still puts an upper bound on how large a dataset can be processed with this technique.