
I have just inherited a site with a PHP script that is consistently running out of memory at 117 MB. This happens even when I increase PHP's memory_limit variable to 312 MB, which I'm doing via php.ini.

This is now solved thanks to a great clue from pcguru. See my answer below, which begins: "I have finally found the answer".

ini_get('memory_limit') returns the value set in php.ini, so I'm certain Apache has restarted after changing the value. I'm using memory_get_usage(true) to return the memory consumed by the script at various points along the way. And it's consistently failing when it gets to 117 MB.

Is there some internal PHP limit I'm unaware of that has it never allocate more than 117MB to an individual script?
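For concreteness, the checkpoint logging described above looks roughly like this (a minimal sketch; `ini_get` and `memory_get_usage` are standard PHP builtins, but the `array_fill` call is just a stand-in for the real data load):

```php
<?php
// Report the configured limit once, then log allocated memory at checkpoints.
echo 'memory_limit: ' . ini_get('memory_limit') . "\n";

function checkpoint($label)
{
    // true => memory allocated from the system, not just memory in use by the script
    printf("%s: %.1f MB\n", $label, memory_get_usage(true) / 1048576);
}

checkpoint('start');
$rows = array_fill(0, 100000, str_repeat('x', 100)); // stand-in for the real data load
checkpoint('after load');
```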

The server has 1GB of RAM and is running CentOS. I have root shell access. PHP is version 5.3.18. MySQL is version 5.1.66-cll.

This script is behind a username/password and I can't provide public access to it.

Edited to Add:

1) Thanks all for your help to date. You'll find more info in my replies to specific user comments under various answers below.

2) Suhosin is definitely not installed. I've checked in multiple places, including running a script to check for its constants and running php -v.

3) The Apache log has no record of the specific error message I'm getting. Logging is switched on in php.ini. I piped the entire log through grep to search it.

4) Is it possible the wrong error is being reported in this case?

hakre
user8109
  • What exactly is the error that you see? Perhaps you're confused by the "tried to allocate xxx bytes". – Ja͢ck Dec 19 '12 at 15:42
  • Kudos for having an up-to-date PHP 5.3 version. – Levi Morrison Dec 19 '12 at 15:42
  • Do you have the suhosin patch by any chance? – DaveRandom Dec 19 '12 at 15:43
  • What's the script doing? What's the error you get when it runs out of memory? – Leigh Dec 19 '12 at 15:44
  • It is possible to change the memory limit many times in the execution flow in PHP. It could be that the memory limit is changed by some function/library that you use? What is the memory limit on the line immediately preceding the line where the script fails? – oldwizard Dec 19 '12 at 15:45
  • http://forums.cpanel.net/f5/php-memory-limit-problem-69715.html – DaveRandom Dec 19 '12 at 15:45
  • I know this is off-topic, but what are you doing with PHP that needs to allocate 300mb of RAM? That's quite a lot more than your typical PHP script. There are plenty of perfectly good reasons for needing that kind of RAM, so I'm not trying to criticise, but I would be interested to know more, as I've seen a lot of cases where people have written PHP programs to load vast amounts of data into memory when it was more efficient (and usually quicker too) to only load a bit of the data at a time. – SDC Dec 19 '12 at 15:51
  • SDC: There are a lot of use cases, where 300mb is easily passed. E.g. image processing, PDF generation, etc. – aebersold Dec 19 '12 at 15:57
  • Since memory_limit can be set in both the ini, htaccess file or in code with ini_set. I would check what the value of the setting with ini_get to see if its what you expect. – datasage Dec 19 '12 at 16:01
  • @Jack - the full error message is: Fatal error: Out of memory (allocated 123207680) (tried to allocate 79 bytes) in /home/tankbase/public_html/companies4.php on line 248. – user8109 Dec 19 '12 at 16:14
  • @DaveRandom - I don't know what the suhosin patch is. I've searched for suhosin on the output of phpinfo() and it doesn't show. – user8109 Dec 19 '12 at 16:16
  • @Leigh - see reply to Jack above for error. The script is loading massive amounts of data into a PHP array. It then runs a foreach loop and merges thousands of records. It runs out of memory at record #7400 and something. NOTE: I didn't write it, wouldn't do it this way, but need a temporary fix and there's RAM to spare. – user8109 Dec 19 '12 at 16:18
  • @pcguru - I used find/grep to find all instances of ini_set and checked them one by one. Nothing is setting PHP memory. Also, I had the script itself report to me the memory_limit (which doesn't change throughout). – user8109 Dec 19 '12 at 16:19
  • @DaveRandom - I entered this at the prompt: cat /usr/local/lib/php.ini | grep suhosin. It returned nothing. According to the forum you sent me to, Suhosin is configured through php.ini so assume I don't have it installed. I did the above with 'memory' as well. I got 2 comments and the usual memory_limit that I've been changing. – user8109 Dec 19 '12 at 16:23
  • @SDC - yes I know I know. I didn't write this and currently believe it should be rewritten to use temporary tables. But it's not a simple query and I'm looking for a quicker fix so I have time to properly analyse what's happening. It may be those massive arrays are actually needed by other elements within the script/page. – user8109 Dec 19 '12 at 16:25
  • @WayneDavies It sounds like that's not the answer then. Are you sure your system isn't simply running out of available memory to allocate? – DaveRandom Dec 19 '12 at 16:25
  • @datasage - good point. I'd forgotten about .htaccess. Am checking now. No, .htaccess is present but empty. – user8109 Dec 19 '12 at 16:27
  • Did you check what the memory limit is where the script execution fails? Do something like echo ini_get('memory_limit'); die; just before line 248. – oldwizard Dec 19 '12 at 20:44
  • @pcguru - yes I have done that thanks. I think your answer below ('since the server has only 1 GB of RAM') is the answer. It's the only thing (I've heard/found so far) that fits all the facts. I'm going to do a bit more checking. – user8109 Dec 20 '12 at 07:05

5 Answers


I have finally found the answer. The clue came from pcguru's answer beginning 'Since the server has only 1 GB of RAM...'.

On a hunch I looked to see whether Apache had memory limits of its own as those were likely to affect PHP's ability to allocate memory. Right at the top of httpd.conf I found this statement: RLimitMEM 204535125

This is put there by WHM/cPanel. According to the following webpage, WHM/cPanel incorrectly calculates this value on a virtual server: http://forums.jaguarpc.com/vps-dedicated/17341-apache-memory-limit-rlimitmem.html

The script that runs out of memory gets most of the way through, so I increased RLimitMEM to 268435456 (256 MB) and reran the script. It completed its array merge and produced the csv file for download.

ETA: After further reading about RLimitMEM and RLimitCPU I decided to remove them from httpd.conf. This allows ini_set('memory_limit','###M') to work, and I now give that particular script the extra memory it needs. I also doubled the RAM on that server.
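For anyone doing the same, the per-script override amounts to this (a sketch; '256M' is an illustrative value, not the exact figure I used):

```php
<?php
// ini_set returns the previous value on success, or false on failure.
// Note: even when this succeeds, a process-level cap such as Apache's
// RLimitMEM still wins -- the allocation simply fails at runtime.
$old = ini_set('memory_limit', '256M');
if ($old === false) {
    echo "could not change memory_limit\n";
} else {
    echo 'memory_limit is now ' . ini_get('memory_limit') . "\n";
}
```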

Thank you to everyone for your help in detecting this rather thorny issue, and especially to pcguru who came up with the vital clue that got me to the solution.

user8109

Since the server has only 1 GB of RAM I'm leaning towards the possibility that you have actually run out of system memory entirely.

See this thread. You get the same "PHP Fatal error: Out of memory" instead of the more common "Fatal error: Allowed memory size of ...". Your error indicates the system cannot allocate more memory at all, meaning even PHP's internal functions cannot allocate more memory, let alone your own code.

How is PHP configured to run with Apache? As a module or as CGI? How many PHP processes can you have running at the same time? Do you have swap space available?

If you use PHP as a module in Apache, Apache has a nasty habit of holding on to memory that the PHP process allocated; my guess is that since it can't restart just the PHP module inside a worker, it would have to restart the worker entirely. Each worker that serves a script that allocates a lot of RAM simply grows toward the PHP memory limit over time. So if you have many workers running at the same time, each using 100 MB+, you will quickly run out of RAM. Try limiting the number of simultaneous workers in Apache.
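With mod_php on the prefork MPM, limiting workers usually means something like this in httpd.conf (illustrative values only, not a tested configuration; directive names are from the Apache 2.2 prefork MPM):

```apache
# Prefork MPM: each worker is a separate process holding its own PHP heap.
<IfModule mpm_prefork_module>
    MaxClients           8     # keep workers few when each can grow to 100 MB+
    MaxRequestsPerChild  500   # recycle workers so held memory is returned to the OS
</IfModule>
```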

oldwizard
  • Thanks pcguru - this makes sense and fits the facts. To answer your questions: PHP is run as a module. There is swap space available and being used - it's a virtual server running 8 sites. There are some largish DB tables - biggest has 2.4 million records. There are plenty of site visitors, and this report is one of dozens their staff could be running. Your explanation also answers another puzzle, which is why did the report succeed in the wee small hours the night before (when I thought I'd fixed the problem :-). I'm about to watch it run with top in another window. – user8109 Dec 20 '12 at 07:10
  • Running top shows the system isn't running out of physical RAM while the query is running. It dropped to around 60MB just before crashing, but still had 1.6GB of swap space. This was a great suggestion and it may have led me to the actual answer, so thanks for that. I'm just testing now. – user8109 Dec 20 '12 at 07:41
  • Good to hear it, there are ways to reduce memory usage with Apache/PHP but it generally comes with lower performance as a result. I hope you get it working. Otherwise one can always buy more RAM for the server. :) – oldwizard Dec 20 '12 at 08:40
  • Heh - funny you should say that. The client has just agreed to double the ram and add another cpu. It's a virtual so stop the instance, click a few buttons, and restart. I love virtuals! Hopefully Cpanel/WHM won't come along and clobber my edit to httpd.conf. – user8109 Dec 20 '12 at 11:11

This may not be the answer to your problem, but if you run PHP from the command line you can override the memory limit set in php.ini:

php -d memory_limit=321M my_script.php

I'm not exactly sure what the default memory limit via cli is.

Also, you can run php --ini and check the results.

aebersold
  • php --ini returns: Configuration File (php.ini) Path: /usr/local/lib; Loaded Configuration File: /usr/local/lib/php.ini; Scan for additional .ini files in: (none); Additional .ini files parsed: (none) – user8109 Dec 19 '12 at 16:10
  • Also - I can't run the script from the prompt because the part that crashes requires several post variables. Unfortunately it's quite complex, using massive arrays for reasons that make no sense to me. In my view it should be using temporary tables and just sending the final result to PHP. But in the meantime, if I can give it 198MB it will complete which is a useful temporary fix. – user8109 Dec 19 '12 at 16:13
  • Understand that. Really strange behaviour. – aebersold Dec 19 '12 at 16:59

This isn't an answer to why your script is dying after a certain memory usage, but you can get around it by removing the memory limit entirely within the PHP script itself:

ini_set('memory_limit', '-1');

This is dangerous. If you have a runaway script, PHP will take memory until your server has none left to allocate and falls over. So you should only use this if you're sure the script itself isn't a problem, and only to test the output.

As to if PHP has some per-script limit on memory usage, no. I have personally run scripts with near 1GB memory usage.

K.A.F.
  • Thanks M_user. I can't run the risk of giving PHP carte blanche. The client's sites are accessed internationally so there isn't convenient downtime. Useful to know PHP isn't imposing a limit - thanks for that. Is it possible the error is being misreported? – user8109 Dec 19 '12 at 16:44
  • That's not good advice. I'd say optimize your app to use less memory instead of allowing unlimited memory use. And if a script is dying before the memory limit is reached, it wouldn't matter how much you increase it; the script will still die. – phoenix Sep 06 '22 at 20:54

You have an infinite loop somewhere in your code.

Peter Kiss
  • How can you say that without looking at the code? It's a best guess that shouldn't be posted as an answer. – webnoob Dec 19 '12 at 16:13
  • If raising the memory limit does not help, this is the second thing that can cause this type of error. Anyone can reproduce it with an infinite loop or with a bad recursive function. – Peter Kiss Dec 19 '12 at 16:17
  • It's not an infinite loop. That would require an infinite array! It's consistently crashing at 117MB despite (supposedly) having more RAM available. What's more, the loop in which the crash occurs is doing an array merge. Each iteration nibbles around 79 more bytes. – user8109 Dec 19 '12 at 16:30