
I have a list of data with 999,000 records.

I have a select query and a while loop to get the data, and I use array_push to add each retrieved row to one array inside the loop.

Then I want to process the values in this array in batches of 1000 per loop.

My problem is that when I use array_push with this much data, I get the error:

Fatal Error: Allowed Memory Size of 134217728 Bytes

How can I optimize my code to resolve my problem?

My code is below:

 $sql = "select customer_id";
 $sql .= " from";
 $sql .= "  t_customer t1";
 $sql .= "  inner join t_mail_address t2 using(mid, customer_id)";
 $result = $conn->query($sql);
 $customerArray = array();
 while ($row = $result ->fetch(PDO::FETCH_ASSOC)) {
    array_push($customerArray , $row);
 }
 // Execute every 1000 record 
 foreach(array_chunk($customerArray , 1000) as $execCustomerArray ) { 
   // My code to execute for every records.
   // ....
 }
  • Let's assume each record of those ~1m records is ~1kb in size, they'll consume ~1GB of memory, plus some overhead. You don't typically want to store that much in memory at once. Why do you need to do that at all? Do your processing in the `while` loop where you fetch from the database, processing one record at a time. – deceze Nov 06 '17 at 15:51
  • If you're going to do something every 1000 records, why not just check if you've hit 1000 within your `while` loop? – Patrick Q Nov 06 '17 at 15:51
  • You need to chunk the query results before you get them. Not after. Use `LIMIT` and `OFFSET` to get paginated results and run as many queries as you need to – apokryfos Nov 06 '17 at 16:05
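A minimal sketch of the batching the commenters describe, processing every 1000 rows inside the fetch loop instead of collecting all 999,000 first. It assumes the same `$conn` PDO connection as the question; `processBatch()` is a hypothetical stand-in for whatever the per-1000-records work is:

 $sql = "select customer_id";
 $sql .= " from";
 $sql .= "  t_customer t1";
 $sql .= "  inner join t_mail_address t2 using(mid, customer_id)";
 $result = $conn->query($sql);
 $batch = array();
 while ($row = $result->fetch(PDO::FETCH_ASSOC)) {
   $batch[] = $row;
   if (count($batch) === 1000) {
     processBatch($batch); // hypothetical: your per-1000-records logic
     $batch = array();     // free the buffer before collecting the next chunk
   }
 }
 if (!empty($batch)) {
   processBatch($batch);   // flush the final, partial batch
 }

With this, at most 1000 rows sit in the PHP array at any time.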

1 Answer


I'm unsure whether it will fix anything, but one thing I will say is that pushing all the records into an array one at a time is silly.

You're using fetch to retrieve them one by one and then adding them all to an array; why on earth aren't you just using PDOStatement::fetchAll()?

Example:

 $sql = "select customer_id";
 $sql .= " from";
 $sql .= "  t_customer t1";
 $sql .= "  inner join t_mail_address t2 using(mid, customer_id)";
 $result = $conn->query($sql);
 $customerArray = $result->fetchAll(PDO::FETCH_ASSOC);
 // Execute every 1000 record 
 foreach(array_chunk($customerArray , 1000) as $execCustomerArray ) { 
   // My code to execute for every records.
   // ....
 }

This may not fix your memory issue, because we can't see what the heavy lifting is for each customer record, but I will say that while the loop you had was silly, it is most likely not the cause of your memory problem.

Depending on whether this is a CLI script or a web page, you could also use an incremental loop together with MySQL's LIMIT clause to implement basic paging of your data, which prevents it all from coming into memory at once.
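To make the paging idea concrete, here is a rough sketch rather than a drop-in implementation: it reuses the question's `$conn` PDO connection, uses a hypothetical `processBatch()` for the per-page work, and adds an ORDER BY so pages stay stable between queries. Each query pulls at most 1000 rows, so the full result set never has to sit in PHP memory at once:

 $pageSize = 1000;
 $offset = 0;
 do {
   $sql = "select customer_id";
   $sql .= " from";
   $sql .= "  t_customer t1";
   $sql .= "  inner join t_mail_address t2 using(mid, customer_id)";
   $sql .= " order by customer_id";        // a stable order keeps pages consistent
   $sql .= " limit $pageSize offset $offset";
   $page = $conn->query($sql)->fetchAll(PDO::FETCH_ASSOC);
   if (!empty($page)) {
     processBatch($page);                  // hypothetical: your per-1000-records logic
   }
   $offset += $pageSize;
 } while (count($page) === $pageSize);     // a short page means we've reached the end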

– Zachary Craig