77

In the same system, I can make calls to the database without any problem, but in some cases (with the biggest table) I get:

"PHP Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 32 bytes) in /home/forge/sximo.sp-marketing.com/vendor/laravel/framework/src/Illuminate/Database/Connection.php on line 311

I debugged the code and the problem is a basic query:

"  SELECT partidascapturainfo.* FROM partidascapturainfo    WHERE partidascapturainfo.partidascapturainfoid IS NOT NULL       ORDER BY partidascapturainfoid asc   LIMIT  0 , 10 "

When I run the query in a MySQL client, the query runs in 0.17s.

I've already set memory_limit to 2048M and restarted nginx, and my query only returns 10 rows...

Here are my 10 rows:

123044,42016,249,3762,2,,0
123045,42016,249,3761,2,,0
123046,42016,249,3764,1,,0
123047,42016,249,3765,,,0
123048,42016,249,3775,,,0
123049,42016,249,3771,3,,0
123050,42016,249,3772,3,,0
123051,42016,250,3844,HAY,,0
123052,42016,255,3852,,,0
123053,42017,249,3761,1,,0

Any idea what's going on?

Kenny Horna
  • 13,485
  • 4
  • 44
  • 71
Juliatzin
  • 18,455
  • 40
  • 166
  • 325
  • If you show us what kind of data you are retrieving, it could just be that the data in those 10 rows is massive. You might want to profile your code to see where the memory is getting eaten up. – jardis Jan 18 '16 at 22:37
  • how can I profile my code? I will update my question with a row as example – Juliatzin Jan 18 '16 at 22:41
  • The simplest way is http://stackoverflow.com/a/880483/3358181, but there are more advanced tools. – jardis Jan 18 '16 at 23:10
  • 1
    Possible duplicate of [Allowed memory size of 33554432 bytes exhausted (tried to allocate 43148176 bytes) in php](https://stackoverflow.com/questions/415801/allowed-memory-size-of-33554432-bytes-exhausted-tried-to-allocate-43148176-byte) – Muhammad Mar 06 '18 at 18:50
  • Does this answer your question? [Fatal Error: Allowed Memory Size of 134217728 Bytes Exhausted (CodeIgniter + XML-RPC)](https://stackoverflow.com/questions/561066/fatal-error-allowed-memory-size-of-134217728-bytes-exhausted-codeigniter-xml) – Your Common Sense Jul 23 '22 at 05:45

22 Answers

121

Warning: please be advised that the answer below is not a solution. You cannot increase the memory limit indefinitely; what is really needed is an answer that helps reduce memory consumption instead of raising the limit. As a very first measure, try updating your Laravel/Eloquent versions; chances are this error is fixed in a newer version.

You can try editing /etc/php5/fpm/php.ini:

; Old Limit
; memory_limit = 512M

; New Limit
memory_limit = 2048M

You may need to restart nginx:

sudo systemctl restart nginx
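To confirm the running web process actually picked up the new value (FPM and the CLI read different php.ini files), here is a minimal sketch you could drop into routes/web.php; the route name is just an example, not part of the original answer:

    // Hypothetical debug route: reports the memory_limit this process runs with
    // and which php.ini file it was loaded from. Remove it once you are done.
    Route::get('/debug-memory', function () {
        return [
            'memory_limit'   => ini_get('memory_limit'),
            'loaded_php_ini' => php_ini_loaded_file(),
        ];
    });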

You may also have an infinite loop somewhere. Can you post the code you're calling?

Legionar
  • 7,472
  • 2
  • 41
  • 70
Deciple
  • 1,864
  • 3
  • 16
  • 19
31

This happened to me with Laravel 5.1 on PHP 7 when I was running a bunch of unit tests.

The solution was to change memory_limit in php.ini, but it has to be the correct file: the one your setup actually uses. For me it was located at:

/etc/php/7.0/cli/php.ini

In that file, find the line with

 memory_limit

After that, you need to restart the PHP service:

sudo service php7.0-fpm restart

To check whether it was changed successfully, I ran this from the command line:

 php -i

The report contained the following line:

memory_limit => 2048M => 2048M

Now the test cases run fine.
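As a side note (not part of the original answer), if you only need the higher limit for the test run itself, you can also override the setting ad hoc on the command line instead of editing php.ini:

    php -d memory_limit=2048M vendor/bin/phpunit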

Yevgeniy Afanasyev
  • 37,872
  • 26
  • 173
  • 191
  • I have confirmed an unlimited memory: `memory_limit => -1 => -1`. However I'm still getting errors about out of memory (`Out of memory (allocated 8388608) (tried to allocate 217088 bytes)`). – JCarlosR May 15 '18 at 21:16
  • I have never suggested setting your PHP to unlimited memory. – Yevgeniy Afanasyev May 16 '18 at 03:41
  • `php -i` displayed `memory_limit => -1 => -1` but in `/etc/php7.1/fpm/php.ini` has `memory_limit=2048M` Am I missing anything? – Azima Feb 10 '21 at 11:06
  • try php -v please – Yevgeniy Afanasyev Feb 11 '21 at 00:16
  • @Azima Yes, mine too. I change `memory_limit` at `/apache/php.ini` file. In your case, it's `fpm/php.ini`. And then I restart service apache. I don't edit the `php/php.ini` file. It remains `memory_limit=-1`. – ibnɘꟻ Oct 11 '21 at 02:55
8

Edit the following two files (you may only have one of them):

sudo nano /etc/php/7.3/fpm/php.ini

sudo nano /etc/php/7.3/cli/php.ini

Change the memory limit, for example from 512M to 1024M:

memory_limit = 1024M

Restart nginx:

sudo systemctl restart nginx

Check the memory_limit

php -i | grep "memory_limit"

Run your software to check if the new memory_limit is sufficient.

If not, increase it again (and repeat the steps above). To 2048M, for instance:

memory_limit = 2048M
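To see how close you are to the limit (this logging line is my addition, not from the answer), you can record the peak memory the failing request reaches:

    // Log peak memory usage in MB, e.g. just before the response is returned
    \Log::info('Peak memory: ' . round(memory_get_peak_usage(true) / 1048576, 2) . ' MB');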
Darren Murphy
  • 1,076
  • 14
  • 12
  • This didn't solve it; I had already tried it. The error is the same, just with smaller numbers in the message: Allowed memory size of 268435456 bytes exhausted (tried to allocate 16384 bytes) – Yogi Arif Widodo Feb 22 '22 at 03:24
4

Share the lines of code executed when you make this request. There might be an error in your code.

Also, you can change the memory limit in your php.ini file via the memory_limit setting. Try doubling your memory to 64M. If this doesn't work you can try doubling it again, but I'd bet the problem is in your code.

ini_set('memory_limit', '64M');
Alec Walczak
  • 427
  • 1
  • 5
  • 17
4

I realize there is an accepted answer, and apparently it was either the memory increase or the infinite-loop suggestion that solved the issue for the OP.

For me, I had added an array to the config file earlier and made some other changes before running artisan and getting the out-of-memory error, and no amount of increasing memory helped. It turned out to be a missing comma after the array I had added to the config file.

I am adding this answer in the hope that it helps someone else figure out what might be causing an out-of-memory error. I am using Laravel 5.4 under MAMP.

StevenHill
  • 320
  • 3
  • 8
  • This is absolutely correct. In addition to this answer, I'd suggest analyzing the code properly and figuring out where the error is. In my case i was trying to access undefined variables but because it was on the Artisan console it throws such error – Alemoh Rapheal Baja May 24 '22 at 08:30
4

I suggest you check how much memory your Apache server actually has available (in MB) and then set the memory limit accordingly, because the server can crash if it only has 1024 MB available and you set memory_limit to 2048 MB.

Use ini_set in your PHP code:

ini_set('memory_limit', '512M');

You can also change it in php.ini:

memory_limit = 512M

Find the location of php.ini using the command line on Linux or Windows:

php --ini


  • 1
    Thanks for your answer. This solution treats the symptom but not the cause. If there's a process that uses fairly more memory than configured there might be something wrong with the process. – shaedrich May 27 '21 at 07:17
3

This problem occurred for me when using a nested try/catch and calling the $ex->getPrevious() function while logging the exception. Maybe your code has an endless loop, so check the code first and only increase the memory limit if necessary.

    try {
        // get the latest product data and stock from the API
        $latestStocksInfo = Product::getLatestProductWithStockFromApi();
    } catch (\Exception $error) {
        try {
            $latestStocksInfo = Product::getLatestProductWithStockFromDb();
        } catch (\Exception $ex) {
            // log the exception
            // This call is the problem: passing $ex->getPrevious() into the log context
            Log::channel('report')->error(['message' => $ex->getMessage(), 'file' => $ex->getFile(), 'line' => $ex->getLine(), 'Previous' => $ex->getPrevious()]);
            // This call is OK
            Log::channel('report')->error(['message' => $ex->getMessage(), 'file' => $ex->getFile(), 'line' => $ex->getLine()]);
        }
        // log the exception
        Log::channel('report')->error(['message' => $error->getMessage(), 'file' => $error->getFile(), 'line' => $error->getLine()]);
    }
fatemeh sadeghi
  • 1,757
  • 1
  • 11
  • 14
3

For LiteSpeed servers with an lsphp*.* package:

Use the following command to find the currently set memory limit for PHP applications:

php -r "echo ini_get('memory_limit').PHP_EOL;"

To locate the active php.ini file from the CLI:

php -i | grep php.ini

Example:

/usr/local/lsws/lsphp73/etc/php/7.3/litespeed/php.ini

To change the php.ini default value to a custom one:

php_memory_limit=1024M # or whatever you want it set to
sed -i 's/memory_limit = .*/memory_limit = '${php_memory_limit}'/' /usr/local/lsws/lsphp73/etc/php/7.3/litespeed/php.ini

Don't forget to restart lsws with: systemctl restart lsws

Yashodhan K
  • 83
  • 1
  • 5
2

I got this error when I restored a database and didn't add the user account and privileges back in. Another site gave me an authentication error, so I didn't think to check that, but as soon as I added the user account back everything worked again!

Dylan Glockler
  • 1,115
  • 1
  • 20
  • 40
2

Sometimes limiting your data also helps. For example, compare the following:

//This caused error:
print_r($request);

//This resolved issue:
print_r($request->all());
Naser Nikzad
  • 713
  • 12
  • 27
1

I had the same problem. No matter how much I increased memory_limit (I even tried 4 GB) I kept getting the same error, until I figured out it was caused by wrong database credentials set up in the .env file.

GarryOne
  • 1,430
  • 17
  • 21
1

I had this problem when trying to resize a CMYK JPEG using the Intervention / GD library. I had to increase the memory_limit.
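For context, a minimal sketch of the kind of call involved, assuming the Intervention Image v2 Laravel integration (the file name and the per-request ini_set are my assumptions, not from the answer):

    // Raise the limit for this request only; decoding a large CMYK JPEG with GD
    // can easily exceed the default memory_limit.
    ini_set('memory_limit', '512M');

    $img = \Intervention\Image\Facades\Image::make('photo-cmyk.jpg')
        ->resize(1200, null, function ($constraint) {
            $constraint->aspectRatio();
        });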

Keith Turkowski
  • 751
  • 7
  • 11
1

While using Laravel on an Apache server, there is another php.ini:

 /etc/php/7.2/apache2/php.ini

Modify the memory limit value in this file

 memory_limit=1024M

and restart the Apache server:

 sudo service apache2 restart
athulpraj
  • 1,547
  • 1
  • 13
  • 24
1

If you are in the same situation as me:

  • Laradock or other Docker containers
  • No errors
  • No logs
  • A Laravel request eats up all your memory in Docker and is killed silently
  • You even tried to debug your code with Xdebug and nothing happened after the exception

The reason is Xdebug 3.0 with develop mode enabled: it prints detailed info about exceptions, which in the case of Laravel takes a lot of memory. Even 3 GB is not enough; it dies without printing any info about it.

Just disable Xdebug, or at least develop mode, and it will be fine.
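As noted in the comments below, the concrete change goes in php.ini (or xdebug.ini); a minimal sketch:

    ; was: xdebug.mode = develop
    ; use "debug" instead of "off" if you still need step debugging
    xdebug.mode = off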

KorbenDallas
  • 944
  • 9
  • 16
  • I am on the same situation with Laradock. How do you disable `xDebug` & `develop`? Could you please give more details? Here is the error from the logs: `production.ERROR: Allowed memory size of 134217728 bytes exhausted (tried to allocate 2621440 bytes) {"exception":"[object] (Symfony\\Component\\ErrorHandler\\Error\\FatalError(code: 0): Allowed memory size of 134217728 bytes exhausted (tried to allocate 2621440 bytes) at /var/www/vendor/laravel/framework/src/Illuminate/View/ComponentSlot.php:7) [stacktrace]` – Pathros May 25 '22 at 18:55
  • @Pathros in your `php.ini` or `xdebug.ini` change string `xdebug.mode = develop` to `debug` or `off` – KorbenDallas May 25 '22 at 19:28
  • More info https://xdebug.org/docs/all_settings#mode – KorbenDallas May 25 '22 at 19:28
  • I have checked the `.env` file and I have it set it to false: `WORKSPACE_INSTALL_XDEBUG=false` & `PHP_FPM_INSTALL_XDEBUG=false`. So my memory problem in the `php-worker` container must be something else. Any ideas? – Pathros May 25 '22 at 19:49
  • Did you rebuild your php-fpm container and recreated it? – KorbenDallas May 25 '22 at 19:51
1
  1. On a Mac, run the following in the terminal:

php -r "echo ini_get('memory_limit').PHP_EOL;"

This will tell you your current memory limit.

  2. Then, run

php -i | grep php.ini

This will show you where your php.ini file is.

  3. Open this file and increase the memory limit. For example, unlimited is:

memory_limit = -1

  4. Run the first command again to test:

php -r "echo ini_get('memory_limit').PHP_EOL;"

  5. (optional) If it still doesn't work, go to your memory limits file. Mine was

/usr/local/etc/php/8.0/conf.d/php-memory-limits.ini

In here, change the memory_limit

Brad Ahrens
  • 4,864
  • 5
  • 36
  • 47
1

Check that you are not calling @extends recursively. In my case I had

@extends('layouts.app')

in resources/views/layouts/app.blade.php
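To illustrate (the child view below is a hypothetical example, not from the answer): the layout must not extend itself; only child views should extend it.

    {{-- resources/views/layouts/app.blade.php --}}
    {{-- WRONG: the layout extends itself, so rendering recurses until memory runs out --}}
    @extends('layouts.app')

    {{-- resources/views/home.blade.php, a child view --}}
    {{-- RIGHT: only child views extend the layout --}}
    @extends('layouts.app')

    @section('content')
        ...
    @endsection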

John Muraguri
  • 426
  • 7
  • 6
0

I have also been through this problem. In my case, I was adding data to an array and passing that array back into the same array, which caused the memory limit problem. Some things to consider:

  1. Review your code and check whether any loop runs infinitely.

  2. Reduce unwanted columns if you are retrieving data from the database.

  3. You can also increase the memory limit in XAMPP or whatever other stack you are running.

Lakmi
  • 1,379
  • 3
  • 16
  • 27
0

For XAMPP, the file is at xampp\php\php.ini. My new setting in it now looks like this:

;Maximum amount of memory a script may consume
;http://php.net/memory-limit
memory_limit=2048M
;memory_limit=512M
CodeToLife
  • 3,672
  • 2
  • 41
  • 29
0

I experienced this same problem today. Referring to the answer about looping issues, I found that my problem was a model with a HasOne relationship that is eager loaded by default via $with. It kept looping because the child loads the parent, and the parent's $with loads the child again.

MuAf
  • 29
  • 1
  • 7
  • This does not really answer the question. If you have a different question, you can ask it by clicking [Ask Question](https://stackoverflow.com/questions/ask). To get notified when this question gets new answers, you can [follow this question](https://meta.stackexchange.com/q/345661). Once you have enough [reputation](https://stackoverflow.com/help/whats-reputation), you can also [add a bounty](https://stackoverflow.com/help/privileges/set-bounties) to draw more attention to this question. - [From Review](/review/late-answers/32852252) – zkanoca Oct 06 '22 at 22:24
0

The memory_limit solution is a poor man's approach. The actual issue is your application's boot process, so keep removing providers from your config/app.php until the unit test works fine.

Shahid Karimi
  • 4,096
  • 17
  • 62
  • 104
0

Sometimes it is not your code but your configuration. Try a composer dump-autoload so Laravel stops looking for old, dead files or previously cached routes. You can raise the memory sky high and it will be reached anyway, so dump the old config and see. It worked for me.
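A minimal sketch of that cleanup (the two artisan cache-clear commands are my addition, not part of the original answer):

    composer dump-autoload
    php artisan config:clear
    php artisan route:clear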

Laynier Piedra
  • 425
  • 6
  • 16
0

In my case, the error was generated by trying to configure two connections to the same database.

cladelpino
  • 337
  • 3
  • 14