
I have a PHP script which returns lots of rows from my MySQL DB. Usually there is no issue, but occasionally a large number of rows is returned, which causes memory exhaustion.

The fetch is here:

$fetchall = mysqli_fetch_all($getPrepData->get_result(), MYSQLI_ASSOC);

I am wondering what alternative to mysqli_fetch_all would be better for memory, so I can prevent this issue without increasing the memory limit etc.

Update:

I have added the following, as suggested in the comments:

$result = $getPrepData->get_result();
$data = [];
while ($row = $result->fetch_assoc()) {
     $data[] = $row;
}

But I still get a memory error on this line:

 $result = $getPrepData->get_result();
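
(As far as I can tell, get_result() buffers the whole result set on the client, which would explain why the error now appears on that line rather than inside the loop. Below is a minimal sketch of an unbuffered alternative using bind_result()/fetch(); the column names and totals are placeholders, since the real query isn't shown here.)

// Sketch only: assumes $getPrepData is the already-prepared statement from above
// and that the query selects exactly two columns; time1/time2 are placeholder names.
$getPrepData->execute();

// No store_result()/get_result() call, so the result stays unbuffered and
// rows are streamed from the server one at a time.
$getPrepData->bind_result($time1, $time2);

$total1 = 0;
$total2 = 0;
while ($getPrepData->fetch()) {
    // Work with each row in place instead of appending it to an array.
    $total1 += $time1;
    $total2 += $time2;
}
$getPrepData->close();
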
  • Fetch them in a loop one by one. – Markus Zeller Jul 03 '22 at 13:24
  • @MarkusZeller I have added an update to the question. I have tried the looping as you suggest but still get a memory error on $result = $getPrepData->get_result(); – Talktomegoose Jul 03 '22 at 13:32
  • 1
    If you're running out of memory you have two options: increase the memory limit,or use less. There's not enough information here to determine whether the first is appropriate, or how to do the second. – Tangentially Perpendicular Jul 03 '22 at 13:52
  • Now you have the same problem. Why do you put them all into an array instead of working with the dataset? – Markus Zeller Jul 03 '22 at 13:57
  • 1
    Maybe see https://www.php.net/manual/en/mysqlinfo.concepts.buffering.php – user3783243 Jul 03 '22 at 13:59
  • @MarkusZeller Working with the data set would still use the same amount of memory. – Dharman Jul 03 '22 at 15:19
  • 1
    Note that it makes no sense to add the returned rows into array. this way you will have the same error again. Instead, you have to process the rows right in place. – Your Common Sense Jul 03 '22 at 15:21
  • Why do I feel like there is something fundamentally wrong with how you are trying to write the data out? – OldWest Jul 03 '22 at 21:02
  • @OldWest Legacy code I am trying to improve. Backstory for each user's dataset: each row has lots of different times, e.g. time1 and time2 (65 more). On the site we generate paginated running totals for each of the 65 different types of time (page totals, previous page and grand total), and each page has 18 rows. These totals can currently only be generated by getting all the rows, looping over them (paginated) and adding them up. For most users, who have up to 1000 rows, this is no issue and quick, but if a user has 10000 rows then it is slow and memory-heavy, and that is what I am trying to fix (see the sketch after these comments). – Talktomegoose Jul 03 '22 at 21:28
  • @Dharman No, it would not, because the memory will be reused each iteration. – Markus Zeller Jul 04 '22 at 07:55
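
A rough sketch of what the last few comments point at for the running-totals case, assuming a hypothetical table user_times, an open mysqli connection in $mysqli, and request-supplied $userId, $page and $perPage (none of this schema appears in the question): let MySQL do the summing and only fetch the 18 rows of the current page.

// Assumptions: hypothetical table and columns; $mysqli, $userId, $page and $perPage
// are provided elsewhere in the application.
$offset = ($page - 1) * $perPage;

// Grand totals: one aggregated row instead of buffering every row in PHP.
$stmt = $mysqli->prepare(
    'SELECT SUM(time1) AS total_time1, SUM(time2) AS total_time2
       FROM user_times WHERE user_id = ?'
);
$stmt->bind_param('i', $userId);
$stmt->execute();
$grandTotals = $stmt->get_result()->fetch_assoc(); // single row, tiny memory footprint
$stmt->close();

// Current page: only the rows that are actually displayed.
$stmt = $mysqli->prepare(
    'SELECT time1, time2 FROM user_times
      WHERE user_id = ? ORDER BY id LIMIT ? OFFSET ?'
);
$stmt->bind_param('iii', $userId, $perPage, $offset);
$stmt->execute();
$pageRows = $stmt->get_result()->fetch_all(MYSQLI_ASSOC);
$stmt->close();

Previous-page running totals could be produced the same way, with a SUM over the rows before the current offset.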

0 Answers