
I have a do-while loop that calls an API to fetch data from AWS Elasticsearch. When I increase the page size, I get a memory error. I have now reduced the size to 20 and want to loop through all the result pages as a cronjob, yet it still fails, even after raising the memory limit to 4096 MB in my .ini file.

Since it is a cronjob, I have no idea how to use scroll.
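As far as I can tell from the documentation, scroll would be used roughly like the sketch below (this assumes the official elasticsearch/elasticsearch PHP client, 6.x API; the client setup, index name, and query are placeholders, since my real calls go through TransporterController), but I don't see how to keep the scroll context alive across cron runs:

$client = Elasticsearch\ClientBuilder::create()->build();  // placeholder client setup

// open a scroll context; index name and query are placeholders
$response = $client->search([
  'index'  => 'papers',
  'scroll' => '60s',   // how long to keep the scroll context alive between batches
  'size'   => 20,      // small page size to keep memory low
  'body'   => ['query' => ['match' => ['title' => $paper->title]]],
]);

while (!empty($response['hits']['hits'])) {
  foreach ($response['hits']['hits'] as $hit) {
    // process one hit at a time instead of holding every page in memory
  }

  // fetch the next batch with the scroll id from the previous response
  $response = $client->scroll([
    'scroll_id' => $response['_scroll_id'],
    'scroll'    => '60s',
  ]);
}

My current loop is this: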

do {
  // a fresh controller instance is created on every iteration
  $transporter = new TransporterController;

  // assumes $size, $from and $rounds are initialised before the loop
  $response = $transporter->dispatchSearch($paper->title, '55%', $size, $from, $rounds);
  $json_response = json_decode((string) $response->getBody(), true);

  // advance to the next page; the API reports how many rounds (pages) exist
  $rounds = $json_response['rounds'];
  $from = $json_response['from'] + $json_response['size'];

  // clean, persist and notify for this page of results
  $response = $transporter->purify($response, $object);
  $impact_user_id = $paper->impact_user_id;
  $impact_type_id = 2;
  $response = $transporter->saveResult($response, $impact_user_id, $impact_type_id);
  $transporter->notifier($response, $user);
} while ($rounds > 1);

The idea is for the loop to run to completion, down to the last page. This is a Laravel cronjob; the scheduler entry is shown below.
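For completeness, the job is registered in the scheduler roughly like this (the command name is a placeholder, not my actual command):

// app/Console/Kernel.php
use Illuminate\Console\Scheduling\Schedule;

protected function schedule(Schedule $schedule)
{
    $schedule->command('papers:dispatch-search')->daily();
}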

  • Possible duplicate of [Allowed memory size of 33554432 bytes exhausted (tried to allocate 43148176 bytes) in php](https://stackoverflow.com/questions/415801/allowed-memory-size-of-33554432-bytes-exhausted-tried-to-allocate-43148176-byte) – Samir Selia Jan 07 '19 at 09:13
  • chances are your loop is going on forever. Have you tried debugging the `$json_response['rounds']` value? – Jerodev Jan 07 '19 at 09:17
  • Yes, I have restructured the code to avoid the loop: I save the last page to the database and, on the next run, query again using that saved page as the starting offset (a rough sketch of this is below). – Ujah Jideofor Jan 08 '19 at 10:05
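A minimal sketch of that workaround (persisting the last offset and resuming from it on the next cron run; the search_cursors table, its column names, and $paper->id are assumptions, not the poster's actual schema):

use Illuminate\Support\Facades\DB;

// load the offset saved by the previous run; table and columns are assumptions
$cursor = DB::table('search_cursors')->where('paper_id', $paper->id)->first();
$from   = $cursor->last_from ?? 0;

$transporter   = new TransporterController;
$response      = $transporter->dispatchSearch($paper->title, '55%', $size, $from, $rounds);
$json_response = json_decode((string) $response->getBody(), true);

// persist the next offset so the following cron run resumes where this one stopped
DB::table('search_cursors')->updateOrInsert(
  ['paper_id' => $paper->id],
  ['last_from' => $json_response['from'] + $json_response['size']]
);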

0 Answers