I have a problem with Laravel's Eloquent ORM cursor() method. I'm trying to build some analytics functions for my e-commerce site to get the number of orders, the income, and so on. You can check my code below:

$fromDate = Carbon::now()->startOfYear()->toDateString();
$tillDate = Carbon::now()->endOfYear()->toDateString();

$orders = Order::cursor()->whereBetween('created_at', [$fromDate, $tillDate])->whereIn('status', array(1, 2, 3))->filter(function ($order) { 
  return $order; 
});
$countO = $orders->count();
$sumO = $orders->sum('total');

$orders = VoucherOrder::cursor()->whereBetween('created_at', [$fromDate, $tillDate])->where('status', 1)->filter(function ($order) {
  return $order;
});
$countV = $orders->count();
$sumV = $orders->sum('total');

$orders = $countO + $countV;
$income = $sumO + $sumV;

if($orders != 0) {
  $avgOrder = $income / $orders;
}
else {
  $avgOrder = 0;
}

return view('admin.income', compact('orders', 'income', 'avgOrder'));

I used cursor() to reduce memory usage since I'm processing large amounts of data, but I still get this error:

Allowed memory size of 536870912 bytes exhausted (tried to allocate 2338688 bytes)

What am I doing wrong? How can I solve it?

Mono.WTF

3 Answers

If you want to reduce memory usage without editing the php.ini file, I suggest using chunk() rather than cursor().

Model::chunk(250, function ($records) {
    foreach ($records as $record) {
        // your code
    }
});

This will take less memory, but keep in mind it may take a few seconds longer to execute than cursor().
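
Applied to the totals in the question, a sketch could look like this (assuming the same Order model, date range, and status values from the question; the running-total variables are just illustrative):

$countO = 0;
$sumO = 0;

// Constrain the query first so the filtering happens in SQL,
// then pull the matching rows into memory 250 at a time.
Order::whereBetween('created_at', [$fromDate, $tillDate])
    ->whereIn('status', [1, 2, 3])
    ->chunk(250, function ($orders) use (&$countO, &$sumO) {
        $countO += $orders->count();
        $sumO += $orders->sum('total');
    });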

sid

I've written a blog post about it. But basically, you need to use unbuffered queries if you don't want to run out of memory.

Blog post: https://medium.com/@JoseCardona/boost-your-apps-performance-processing-large-mysql-tables-with-eloquent-orm-3724421d97aa
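
With Laravel's MySQL driver this typically means disabling PDO's client-side result buffering; a minimal sketch, assuming the Order model and date range from the question:

use Illuminate\Support\Facades\DB;

// With buffering off, MySQL streams rows to PHP one at a time instead
// of copying the whole result set into memory before iteration starts.
DB::connection()->getPdo()->setAttribute(\PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

foreach (Order::whereBetween('created_at', [$fromDate, $tillDate])->cursor() as $order) {
    // process a single row here
}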

Joskfg

Try editing /etc/php5/fpm/php.ini:

memory_limit = 2048M

then restart PHP-FPM and nginx:

sudo systemctl restart php5-fpm
sudo systemctl restart nginx
Waleed Muaz
  • any idea how to solve this without increasing memory_limit? – Mono.WTF Jul 16 '21 at 10:50
  • You can chunk the cache as well if you can't load it all at once. Do you need all the data to be available at the same time? That is a lot of records to load into memory at once. Maybe you can manage to work things out so that you can do what you need to with those records a chunk at a time and remove them from memory when you are done with them. – Waleed Muaz Jul 16 '21 at 10:51
  • the page shows the income and the number of orders based on a date range, so yes, if I select the whole year there is a lot of data to process – Mono.WTF Jul 16 '21 at 10:58