
Following up on this question: How to chunk results from a custom query in Laravel

I tried:

DB::connection('mgnt')->select($query)->chunk(200, function($orders) {
    foreach ($orders as $order) {
        // a bunch of code...
    }
});

But I get the following error:

FatalErrorException in MigrationController.php line 98:
 Call to a member function chunk() on array

Is chunking possible without having an appropriate Eloquent ORM model? I'm trying to chunk because I get a blank page (and can't find any errors in any log) if the query returns too many results.

Right now it seems that about 50,000 results is the maximum I can query at once. Is that maybe due to some restriction or limitation in Laravel?

kritop

1 Answer


Well, since the query just returns an array of objects, you can simply use PHP's array_chunk():

$result = DB::connection('mgnt')->select($query);

foreach (array_chunk($result, 200) as $orders) {
    foreach ($orders as $order) {
        // a bunch of code...
    }
}
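As a quick illustration of the semantics (note that array_chunk() only splits an array that has already been fully loaded, so it won't reduce peak memory usage by itself):

```php
<?php
// array_chunk() splits an existing array into pieces of the given
// size; the final piece holds whatever is left over.
$rows = range(1, 7); // stand-in for a query result set

foreach (array_chunk($rows, 3) as $i => $chunk) {
    echo "chunk $i: " . implode(',', $chunk) . PHP_EOL;
}
// chunk 0: 1,2,3
// chunk 1: 4,5,6
// chunk 2: 7
```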

Here's what chunk() on an Eloquent model does:

$results = $this->forPage($page = 1, $count)->get();

while (count($results) > 0)
{
    // On each chunk result set, we will pass them to the callback and then let the
    // developer take care of everything within the callback, which allows us to
    // keep the memory low for spinning through large result sets for working.
    call_user_func($callback, $results);

    $page++;

    $results = $this->forPage($page, $count)->get();
}

You could try doing something similar (although I think it should be possible to run your query all at once; I can't help you with that...):

  1. Add a LIMIT 200 clause to your SQL query.
  2. Increase the offset with every query you run: first 0, then 1 * 200, then 2 * 200, and so on.
  3. Repeat until the query returns an empty result (e.g. with a while loop like the one above).
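The steps above could be sketched roughly like this, written against a generic "fetch one page" callable so it can be tried without a database. In the question's context the fetcher would be something along the lines of `fn($limit, $offset) => DB::connection('mgnt')->select($query . " LIMIT $limit OFFSET $offset")`; the function name `chunkQuery` is just a placeholder for this sketch:

```php
<?php
// Manual chunking with LIMIT/OFFSET, mirroring the page loop that
// Eloquent's chunk() uses internally. $fetchPage returns one page
// of rows; $callback receives each non-empty page.
function chunkQuery(callable $fetchPage, int $size, callable $callback): void
{
    $page = 0;
    do {
        $rows = $fetchPage($size, $page * $size);
        if (count($rows) > 0) {
            $callback($rows);
        }
        $page++;
        // A short page means we just read the last chunk.
    } while (count($rows) === $size);
}

// Usage with an in-memory stand-in for the database:
$data = range(1, 450);
$fetchPage = fn(int $limit, int $offset) => array_slice($data, $offset, $limit);

$seen = 0;
chunkQuery($fetchPage, 200, function (array $rows) use (&$seen) {
    $seen += count($rows); // process each chunk here
});
echo $seen . PHP_EOL; // 450
```

Stopping as soon as a page comes back shorter than the chunk size avoids one extra empty query compared to looping until an empty result.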
lukasgeiter
  • Nice idea, but the problem seems to be the first line, as the result is too big to be processed, saved or received. I was hoping the chunk function of Laravel would break apart the request. – kritop Feb 24 '15 at 10:02
  • Ah, I see. Then why not use an Eloquent model? I don't know what's limiting the number of results. It may be the PHP execution time or some other setting on the web server or database server... – lukasgeiter Feb 24 '15 at 10:11
  • It's a complex ETL query with subqueries, variables and calculations, around 60 lines of SQL. I tested it with different MySQL clients; the query takes 17 seconds and returns 250k rows. – kritop Feb 24 '15 at 10:14
  • I've updated my answer. But I really think it should actually be possible to run it all at once... – lukasgeiter Feb 24 '15 at 10:39