5

I've stumbled across an odd issue while optimising my Apache + PHP memory usage. Basically the code blows up with the error message "Fatal error: Allowed memory size of 16777216 bytes exhausted (tried to allocate 50331646 bytes)" when trying to bind the results of a MySQLi query.

The relevant tables are:

CREATE TABLE `note` (
  `noteID` int(11) NOT NULL AUTO_INCREMENT,
  `contentID` int(11) NOT NULL,
  `text` mediumtext NOT NULL,
  PRIMARY KEY (`noteID`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

CREATE TABLE `content` (
  `contentID` int(11) NOT NULL AUTO_INCREMENT,
  `text` varchar(2048) NOT NULL,
  `datestamp` datetime NOT NULL,
  PRIMARY KEY (`contentID`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

and the query that blows up is:

select content.contentID, 
content.text, 
content.datestamp, 
note.noteID, 
note.contentID, note.text 
from basereality.content as content 
inner join basereality.note as note 
on content.contentID = note.contentID 
where content.contentID = 1028 ;

Running the query directly in MySQL on the server works fine, and the size of the 'note' text returned is under a kilobyte.

One weird thing is that the size it is trying to allocate, 50331646, is 0x2FFFFFE in hex, which is a suspiciously round number; it also happens to be 3 × 16,777,215 + 1 (the maximum length of a mediumtext column, times utf8's three bytes per character, plus one). It's almost as if PHP is trying to allocate a buffer large enough to hold the largest possible mediumtext value, rather than allocating memory for the data actually retrieved.

Anyone know a workaround for this - other than to increase the max memory allowed in PHP?

Btw, other people have had the same issue but didn't seem to know how to resolve it: "Out of memory (allocated 50855936) (tried to allocate 50331646 bytes)"

The exact line that blows up is:

$statement->bind_result(
  $content_contentID,
  $content_text,
  $content_datestamp,
  $note_noteID,
  $note_contentID,
  $note_text);
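
For context, the surrounding flow is roughly the following. This is a trimmed-down sketch rather than my exact (auto-generated) code, and the connection details are placeholders:

$mysqli = new mysqli('localhost', 'user', 'password', 'basereality');

$statement = $mysqli->prepare(
    "select content.contentID, content.text, content.datestamp,
            note.noteID, note.contentID, note.text
     from basereality.content as content
     inner join basereality.note as note
     on content.contentID = note.contentID
     where content.contentID = 1028");

$statement->execute();

// This is the bind_result() call that exhausts the 16MB limit:
$statement->bind_result(
    $content_contentID,
    $content_text,
    $content_datestamp,
    $note_noteID,
    $note_contentID,
    $note_text);

while ($statement->fetch()) {
    // use the bound variables
}

$statement->close();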

Changing the query to select 'A test string' as note, rather than fetching the actual column, no longer shows the massive memory usage.

Btw, I am sure that I'm not trying to retrieve large data. This shows the actual length of the mediumtext field in the note table:

 select length(text) from basereality.note;
+--------------+
| length(text) |
+--------------+
|          938 |
|          141 |
|         1116 |
|          431 |
|          334 |
+--------------+
Danack
  • Can you show the *exact* line of PHP code in which this error comes up? – Pekka Aug 11 '12 at 10:17
  • Done - don't think it will help. – Danack Aug 11 '12 at 10:27
  • Are you 1000% sure none of the database fields that you are fetching is > 50MB large? – Pekka Aug 11 '12 at 10:29
  • Yep, added the size of the mediumtext field to the question. Only one is over a kilobyte. – Danack Aug 11 '12 at 10:36
  • Increase the memory to see if you get any other error or exception. I had a memory warning once too; after increasing the memory locally to avoid the warning, I got an exception instead. Once that was fixed, the memory problem disappeared. – mAu Aug 11 '12 at 10:45
  • I performed a brief test to reproduce this: PHP 5.3 on Windows enables [`mysqlnd`](http://us.php.net/manual/en/mysqlnd.install.php) as a replacement for the underlying connector to the database, no bug or increased memory usage. PHP 5.3 on Linux does not enable it by default, and increased memory usage was observed (it did not crash because memory_limit is far above 16 MB, but memory usage was about 17 MB). Reading the PHP source gave me an idea to try `$stmt->attr_set(MYSQLI_STMT_ATTR_UPDATE_MAX_LENGTH, 1);` before `$stmt->execute()`, which brought the memory usage back to normal levels. – DCoder Aug 11 '12 at 11:29
  • mAu - It was working fine before I reduced the max memory limit. I've just tested again and it works fine with memory increased to 128MB and then breaks again when reduced to 16MB. – Danack Aug 11 '12 at 12:57
  • DCoder - I tried your suggestion and added that line, immediately after the prepare and before the execute. It stopped crashing, but also started returning only a single character for all varchar and mediumtext fields. I'm running on an Amazon Linux image and so will try your suggestion of using mysqlnd tomorrow. – Danack Aug 11 '12 at 13:03
  • @Danack57: sorry, I just `var_dump()`ed the memory usage without checking the actual returned data, so I didn't notice that this ATTR causes such a drastic change. Clearly that is unusable :) Let's hope mysqlnd helps. (Additional testing shows the PDO MySQL driver doesn't have this problem.) – DCoder Aug 11 '12 at 14:02
  • DCoder - I checked the MySQLi source also. That attribute is actually a long not a bool http://bugs.mysql.com/bug.php?id=16144 But setting it to anything other than 1 is apparently not supported in PHP. https://bugs.php.net/bug.php?id=62798 – Danack Aug 11 '12 at 15:46
  • Dcoder - yep, changing to mysqlnd fixed this. Fancy writing that as the answer for me to mark correct? – Danack Aug 12 '12 at 09:22

2 Answers

4

I found the answer and commented in another question:

Allowed memory size of 67108864 bytes exhausted

Basically, the old MySQL connector library (libmysqlclient), rather than allocating just the actual size of the 'text' field needed to hold the result, allocates the maximum possible size of the mediumtext type.

i.e. 16 megabytes is always allocated even if the field for the result is only 100 kilobytes.

The easiest and probably best way to fix this is to switch to using the new MySQLND connector, which doesn't show this behaviour.
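
If you're not sure which connector your mysqli extension was built against, the client info string will tell you. A quick check along these lines should work (this assumes nothing beyond the standard mysqli_get_client_info() function; with mysqlnd the returned string contains "mysqlnd", with the old library it is just a libmysql version number):

// Report which client library the mysqli extension is using.
$clientInfo = mysqli_get_client_info();

if (stripos($clientInfo, 'mysqlnd') !== false) {
    echo "mysqli is using mysqlnd: $clientInfo\n";
} else {
    echo "mysqli is using the old libmysql connector: $clientInfo\n";
}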

Danack
0

Have you tried mysqli_store_result? Look at the php.net link below: more people have had the same problem, and some solved it by calling store_result before binding the result.

http://php.net/manual/en/mysqli.store-result.php

http://dev.mysql.com/doc/refman/5.0/en/mysql-stmt-store-result.html
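
Roughly, the idea is to buffer the whole result set on the client with mysqli_stmt::store_result() before binding the result variables. An untested sketch, reusing the $statement and bound variables from the question:

$statement->execute();

// Buffer the full result set client-side before binding; comments on the
// manual pages report this can avoid the oversized per-column allocation.
$statement->store_result();

$statement->bind_result(
    $content_contentID,
    $content_text,
    $content_datestamp,
    $note_noteID,
    $note_contentID,
    $note_text);

while ($statement->fetch()) {
    // process the row
}

$statement->free_result();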

Michael
  • Thanks for the suggestion but my MySQL code is auto-generated from the table definitions, so switching it to use a different way of accessing the tables when there are blob or text fields present would not be easy. – Danack Aug 14 '12 at 09:08