
On my server I have the following error:

Allowed memory size of 268435456 bytes exhausted

This happens inside a foreach loop, and when I check the memory usage in the loop with

memory_get_peak_usage();

I obtain 7254128, which is nowhere near the 268435456 reported as exhausted!

I checked in multiple places and the memory usage is not increasing wildly, so I really don't know where the problem is!

The same script works just fine on my local computer, where I set the memory limit to only 16M in my php.ini file.

Here is the code causing the problem, but I don't think it will be very useful; it's from a plugin for the question2answer open-source platform:

foreach ($badges as $slug => $info) {
    $badge_name = qa_badge_name($slug);
    if (!qa_opt('badge_' . $slug . '_name'))
        qa_opt('badge_' . $slug . '_name', $badge_name);
    $name = qa_opt('badge_' . $slug . '_name');
}
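One way to narrow this down is to log memory usage at every iteration, so you can see whether the spike really happens inside the loop. This is a debugging sketch, not part of the plugin; the original loop body would go where the comment is:

```php
<?php
// Debugging sketch: log current and peak memory at each iteration
// to see whether the allocation spike happens inside the loop at all.
foreach ($badges as $slug => $info) {
    error_log(sprintf(
        'badge %s: current %d bytes, peak %d bytes',
        $slug,
        memory_get_usage(true),      // memory currently allocated by PHP
        memory_get_peak_usage(true)  // high-water mark so far
    ));
    // ... original loop body ...
}
```

If the logged numbers stay small right up to the fatal error, the failing allocation is a single huge request rather than gradual growth.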
darkheir
  • So.... probably some recursive failure in either `qa_opt` or `qa_badge_slug`? – Wrikken Dec 12 '12 at 22:16
  • I put it in the post, but I don't think it'll be useful; it's from a plugin, so the syntax is particular. And I know the code is not clean, it's not mine, but I'm trying to make it work! – darkheir Dec 12 '12 at 22:16
  • @Wrikken could it happen during the foreach? The first iteration of the loop performs fine but the error is thrown on the second. And it works on my local computer, which adds some weirdness! – darkheir Dec 12 '12 at 22:19
  • 1
    Well, then it not only _can_ happen, but _does_ happen within the `foreach` loop, doesn't it? It may be a data issue combined with a logical error. If you haven't got this problem locally, I'd sync that data with the data on your server and see what happens there... – Wrikken Dec 12 '12 at 22:23
  • Tried to sync all the files and the content of both of the databases, but still the same – darkheir Dec 12 '12 at 22:30
  • 2
    Well, you could run an xdebug trace to see what happens right before, which might shed some light on it. But one of those 2 functions mentioned is probably causing trouble. – Wrikken Dec 12 '12 at 22:32
  • 1
    Which version of PHP are you using locally and remote? – Arjan Dec 12 '12 at 22:44
  • @Arjan in local it's 5.3.6 and on the remote server 5.3.15 – darkheir Dec 13 '12 at 08:32

1 Answer


I strongly suspect your problem is due to a 'mis-feature' in the standard MySQL connector library (libmysqlclient). When a row containing a 'long blob' or 'long text' column is fetched from the database, instead of allocating the exact size required for the data, the MySQL library allocates the largest size that could possibly be needed to store the row, i.e. 4 gigabytes of memory.

The easiest way to fix this is to switch to the mysqlnd (MySQL Native Driver) connector, which doesn't have this 'feature'. See:
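To check which driver a given PHP build is using (a diagnostic sketch; the exact `php -i` output format varies by PHP version and setup):

```shell
# Prints "mysqlnd" if PHP was built against the native driver,
# otherwise the build is using libmysqlclient.
php -r 'echo extension_loaded("mysqlnd") ? "mysqlnd\n" : "libmysqlclient\n";'

# The driver also shows up in the client API version string:
php -i | grep -i 'Client API'
```

Since the script works locally but fails on the server, comparing this output on both machines would confirm or rule out the driver difference.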

PHP massive memory usage for SQL query

Allowed memory size of 67108864 bytes exhausted

I'm checking the memory usage in the loop with memory_get_peak_usage(); I obtain 7254128 which is far from the 268435456 exhausted!

That is correct. The memory allocation is failing so peak memory usage never shows the huge allocation as in use.

By the way, you should be able to trace the error message to the exact line of code that triggers the failed memory allocation. If it's not originating from an SQL fetch, then this answer may be wrong.
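If the fatal error isn't reaching you with a file and line number, making sure PHP logs errors somewhere readable will surface it. A php.ini sketch (the log path is an example, not from the original post):

```ini
; Log fatal errors (including file and line) instead of discarding them
log_errors = On
error_log = /var/log/php_errors.log   ; example path, adjust to your server
display_errors = Off                  ; keep errors out of production output
```

The "Allowed memory size ... exhausted" message in that log should name the file and line of the failing allocation, which tells you whether it really is the SQL fetch.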

Danack