
I have a large result set returned in my query. For some reason, when I loop through the results, doing what I need with each row, I'm finding that the script is ending early.

$stmt1 = $mysqli1->prepare("select distinct p.tn, n.model, n.software_version, c.card_part_num
                            from N_DB.DSLAMS n
                            join N_DB.dslam_card c on n.nodealias = c.nodealias
                            join N_DB.dslam_port p on c.index = p.dslam_card_index
                            WHERE ping_reply = '1'
                            order by p.tn
                            LIMIT 60000000");
    //it's less than 60 million, but I set that limit to make sure my issue wasn't that I was setting the limit too low

print "will query depo next \n";
if(!$stmt1->execute())
{
    $tempErr = "Error querying depo db: " . $stmt1->error;
    print $tempErr . "\n";    //show mysql execute error if it exists
    $err->logThis($tempErr);
}
print "done querying depo \n";

// per the mysqli docs, columns must be bound after execute() and before the first fetch()
if(!$stmt1->bind_result($num, $model, $software_version, $card))
{
    echo "Binding results failed: (" . $stmt1->errno . ") " . $stmt1->error;
}
$total_records = $stmt1->num_rows();
print "Total records to process: ".$total_records." \n"; //prints 0, but the loop below clearly fetches many rows
print " Please wait. \n";

  $count = 0;

  while($stmt1->fetch()) {

      $count++;
      if (($count % 50000) == 0){
          print "Working on record ".$count." \n"; //I never see this before it prints done
      }
      //do other things that are more interesting
  }//while
  print "\n done looping through tn's "; //it gets to this too soon

I think I'm processing too many records, and there isn't a logical way to cut the result set below 500,000 rows, so the only solution seems to be dividing up the results. I've been looking at other questions about queries that take forever and about LIMIT/OFFSET, but I'm having trouble understanding what those answers are doing.

How would I divide up my results into chunks so I don't, presumably, run out of memory? I'm relatively new to MySQL, so I apologize if I'm a little green.

Oh, and I checked max_execution_time => 0 => 0, so that looks fine.
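To make the chunking idea concrete, here is a minimal sketch of the LIMIT/OFFSET approach, assuming the same $mysqli1 connection and schema as above; the chunk size of 50,000 is an arbitrary choice:

    $chunkSize = 50000;
    $offset    = 0;

    $stmt = $mysqli1->prepare(
        "select distinct p.tn, n.model, n.software_version, c.card_part_num
         from N_DB.DSLAMS n
         join N_DB.dslam_card c on n.nodealias = c.nodealias
         join N_DB.dslam_port p on c.index = p.dslam_card_index
         WHERE ping_reply = '1'
         order by p.tn
         LIMIT ? OFFSET ?");
    // bind_param binds by reference, so updating $offset before each
    // execute() is enough -- no need to rebind every iteration
    $stmt->bind_param('ii', $chunkSize, $offset);

    do {
        $stmt->execute();
        $stmt->bind_result($num, $model, $software_version, $card);

        $rowsInChunk = 0;
        while ($stmt->fetch()) {
            $rowsInChunk++;
            // ...process each row here...
        }

        $offset += $chunkSize;
        print "Processed through offset " . $offset . " \n";
    } while ($rowsInChunk === $chunkSize); // a short chunk means the last page was reached

    $stmt->close();

One caveat: OFFSET makes MySQL skip over all the earlier rows on every chunk, so it gets slower as the offset grows; for counts in the millions, keyset pagination (WHERE p.tn > [last tn seen] ORDER BY p.tn LIMIT ?) would scale better.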

    Add `ini_set('display_errors', 1); ini_set('log_errors',1); error_reporting(E_ALL); mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);` to the top of your script. This will force any `mysqli_` errors to generate an Exception that you can see on the browser and other errors will also be visible on your browser. – RiggsFolly Jun 21 '18 at 14:18
  • `ini_set('output_buffering',0)` to get prints before end of file. use `flush()` after print. or `ini_set('implicit_flush',1)` instead of `flush()` (see the sketch after these comments) – SirPilan Jun 21 '18 at 14:19
  • I'm seeing Fatal error: Uncaught exception 'mysqli_sql_exception' with message 'Lost connection to MySQL server during query' in /opt/IBM/custom/NAC_Dslam/compare.php:161 – Michele Jun 21 '18 at 15:41
  • If I'm losing connection to the mysql server, would that not help the situation to use limit/offset? – Michele Jun 21 '18 at 15:53
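Following up on the output-buffering comment above, a small sketch of the flushing idea; note that output_buffering itself is a PHP_INI_PERDIR setting, so it generally has to be disabled in php.ini rather than via ini_set(), while implicit_flush and flush() work at runtime:

    ini_set('implicit_flush', '1');  // auto-flush after every piece of output...
    print "Working on record " . $count . " \n";
    flush();                         // ...or flush by hand after each print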

0 Answers