
I run an application on my Linux server that fetches data from large database tables. For instance, around 100,000-200,000 rows amount to about 50 MB of data.

But when I fetch data from a table, the server's memory usage grows far beyond that. After a "select" query the memory stays high, and when I fetch data again it increases even more.

Eventually my application runs out of memory; it only takes about three fetches for this to happen, and the application then returns a 500 error. The server has 2 GB of memory, and I modified my php.ini file to set memory_limit=1500M.

I am wondering why the memory is not freed after the data is fetched. I have searched for this issue but have not found a solution. I hope someone can help me out here. Thanks in advance.

Watercayman
  • Show your code. If you're not doing this already, I'd suggest running the query in a loop and using LIMIT/OFFSET to get a smaller number of rows each time, then building up whatever you need from each query (see the LIMIT/OFFSET sketch after these comments). – aynber May 08 '18 at 14:44
  • You could always chunk your results: https://laravel.com/docs/5.6/eloquent#chunking-results – online Thomas May 08 '18 at 14:46
  • Thank you guys! Problem solved by chunk(). It now only takes 300M-400M of memory, but memory still grows slowly; I am wondering whether it will run out if the application runs for a long time. – Charles Mu May 09 '18 at 03:00
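
A minimal sketch of the LIMIT/OFFSET approach suggested in the first comment, assuming plain PDO against MySQL. The connection credentials, the table name big_table, and the batch size of 1000 are placeholders, not details from the question:

// Hypothetical connection details; replace with your own.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$batchSize = 1000;
$offset    = 0;

do {
    // Fetch one batch at a time instead of the whole table at once.
    $stmt = $pdo->prepare('SELECT * FROM big_table ORDER BY id LIMIT :limit OFFSET :offset');
    $stmt->bindValue(':limit', $batchSize, PDO::PARAM_INT);
    $stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
    $stmt->execute();

    $rows    = $stmt->fetchAll(PDO::FETCH_ASSOC);
    $fetched = count($rows);

    foreach ($rows as $row) {
        // Process $row here, keeping only the results you actually need.
    }

    $offset += $batchSize;
    unset($rows, $stmt); // release the batch before fetching the next one
} while ($fetched === $batchSize);

This keeps only one batch of rows in memory at a time, at the cost of running several smaller queries instead of one large one.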

1 Answer


If you are using Laravel's Eloquent ORM to load a large result set, you can use the chunk() or cursor() methods mentioned in the docs. These process the results lazily instead of loading everything into memory at once, which reduces memory usage.

An example:

Inspector::query()
        ->where('status', true)
        ->cursor() // lazily stream one model at a time instead of loading the whole collection
        ->each(function (Inspector $inspector) {
            $inspector->markAsInspected();
        });
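
If you prefer fixed-size batches, the same query can use chunk() instead. This is only a sketch against the same hypothetical Inspector model shown above, with an arbitrary batch size of 1000:

Inspector::query()
        ->where('status', true)
        ->chunk(1000, function ($inspectors) {
            // $inspectors is a Collection of at most 1000 models per batch
            foreach ($inspectors as $inspector) {
                $inspector->markAsInspected();
            }
        });

Note that if markAsInspected() changes the status column used in the where clause, the Laravel docs recommend chunkById() instead, so that updated rows are not skipped between batches.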

Have a look at the doc links for more examples.

Abdelsalam Megahed