I have a large result set returned in my query. For some reason, when I loop through the results, doing what I need with each row, I'm finding that the script is ending early.
$stmt1 = $mysqli1->prepare("select distinct p.tn, n.model, n.software_version, c.card_part_num
from N_DB.DSLAMS n
join N_DB.dslam_card c on n.nodealias = c.nodealias
join N_DB.dslam_port p on c.index = p.dslam_card_index
WHERE ping_reply = '1'
order by p.tn
LIMIT 60000000");
// the real count is under 60 million; I set the limit that high to make sure my issue wasn't a LIMIT that was too low
if (!$stmt1->bind_result($num, $model, $software_version, $card)) {
    echo "Binding results failed: (" . $stmt1->errno . ") " . $stmt1->error;
}
print "will query depo next \n";
if (!$stmt1->execute()) {
    $tempErr = "Error querying depo db: " . $stmt1->error;
    print $tempErr . "\n"; // show the mysql execute error if one exists
    $err->logThis($tempErr);
}
print "done querying depo \n";
$total_records = $stmt1->num_rows();
print "Total records to process: " . $total_records . " \n"; // says 0, but the loop below clearly fetches many rows and prints things
print " Please wait. \n";
$count = 0;
while ($stmt1->fetch()) {
    $count++;
    if (($count % 50000) == 0) {
        print "Working on record " . $count . " \n"; // I never see this before it prints "done"
    }
    // do other things that are more interesting
} // while
print "\n done looping through tn's "; // it gets to this too soon
I think I'm processing too many records, but there isn't a logical way to cut the results below 500,000, so the only solution is to divide up the results. I've been looking at posts about these queries taking forever, and at the LIMIT ... OFFSET suggestions in the comments. I'd like to divide up the results, but I'm having trouble understanding what those people are doing.
How would I divide up my results into chunks so that I don't, presumably, run out of memory? I'm relatively new to MySQL, so I apologize if I'm a little green.
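As far as I can tell, the LIMIT ... OFFSET answers boil down to something like the sketch below. The batch size and the variable names ($chunk, $offset) are my own placeholders, not anything from those posts:

$chunk = 500000; // placeholder batch size
$offset = 0;
do {
    $stmt = $mysqli1->prepare("select distinct p.tn, n.model, n.software_version, c.card_part_num
        from N_DB.DSLAMS n
        join N_DB.dslam_card c on n.nodealias = c.nodealias
        join N_DB.dslam_port p on c.index = p.dslam_card_index
        WHERE ping_reply = '1'
        order by p.tn
        LIMIT ? OFFSET ?");
    $stmt->bind_param("ii", $chunk, $offset);
    $stmt->execute();
    $stmt->bind_result($num, $model, $software_version, $card);
    $fetched = 0;
    while ($stmt->fetch()) {
        $fetched++;
        // do the interesting per-row work here
    }
    $stmt->close();
    $offset += $chunk;
} while ($fetched == $chunk); // a short (or empty) batch means we reached the end

From what I've read, OFFSET gets slower as it grows because the server still has to walk past all the skipped rows, which seems to be why some answers instead remember the last p.tn of the previous batch and use WHERE p.tn > ? in place of OFFSET.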
Oh, and I checked max_execution_time (phpinfo() shows max_execution_time => 0 => 0, i.e. no limit), so that looks fine.
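Since I'm guessing memory rather than execution time is the real ceiling here, I assume the corresponding setting to check is memory_limit:

print "memory_limit: " . ini_get('memory_limit') . " \n";
print "peak usage: " . memory_get_peak_usage(true) . " bytes \n";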