7

My project uses an open source PHP MySQLi library: https://github.com/ajillion/PHP-MySQLi-Database-Class

But partway through the project it started reporting: "Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 4294967296 bytes) in /home1/flipalbu/public_html/kvisofttest/login-admin/Lib/class.MysqliDb.php on line 422"

My server is: linux x86_64

PHP Version 5.4.17

Mysql Version: 5.5.32

memory_limit = 128M

Line 422 is: `call_user_func_array(array($stmt, 'bind_result'), $parameters);`

The querying part of the code:

    $db = new MysqliDb('LocalHost', 'root', 'PASSWD', 'DB');
    $wqdb = $db->query("SELECT * FROM db_table");
    $con = '';
    foreach ($wqdb as $row) {
        $con .= $row['ID'];
    }
    echo $con;

Is there any way to solve it?


/** Error Code **/

    protected function _dynamicBindResults(mysqli_stmt $stmt)
    {
        $parameters = array();
        $results = array();

        // Read the result metadata to learn each column's name.
        $meta = $stmt->result_metadata();

        // Build one bound variable (by reference) per column.
        $row = array();
        while ($field = $meta->fetch_field()) {
            $row[$field->name] = null;
            $parameters[] = & $row[$field->name];
        }

        // Line 422: bind_result() allocates an output buffer sized to each
        // column's maximum possible length (4GB for LONGTEXT/LONGBLOB),
        // which is what triggers the error here.
        call_user_func_array(array($stmt, 'bind_result'), $parameters);

        // Copy every fetched row into the results array.
        while ($stmt->fetch()) {
            $x = array();
            foreach ($row as $key => $val) {
                $x[$key] = $val;
            }
            array_push($results, $x);
        }
        return $results;
    }
user2637147
  • You're exhausting the memory PHP has available. Presumably because your database table is massive and you're just selecting all records without a condition or limit clause. – Martin Bean Aug 08 '13 at 08:47
  • How many rows are you fetching? – halfer Aug 08 '13 at 08:47
  • There is - don't select everything from your database. – N.B. Aug 08 '13 at 08:47
  • Increase your memory limit to maximum by `ini_set('memory_limit', '-1');` – 웃웃웃웃웃 Aug 08 '13 at 08:48
  • @User016, no; that might be a solution on the command line (and even then, only rarely). However, this looks like a web application, and you don't want instances taking up large chunks of memory. The query needs to be optimised. – halfer Aug 08 '13 at 08:52
  • @user2637147 Look at my answer, you probably have a `longtext` or `longblob` column in your database. This is a known error. – x4rf41 Aug 08 '13 at 08:59
  • @x4rf41 Thank you, I am looking at it. – user2637147 Aug 08 '13 at 09:56
  • For future reference: If you are using that mysqli library linked by the OP, use x4rf41's answer. In short: a column with a type of LONGTEXT creates a memory problem with this library. Use MEDIUMTEXT instead. – David Mar 01 '14 at 23:25
  • I have the same problem. I need to generate a report using the MySQL data, so I created a flat table to hold those records, and phpMyAdmin retrieves them very fast: "rows 0 - 24 (355022 total, Query took 0.0087 seconds)". But when I run the same query in PHP I get "Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes)". Please advise. – Hemanathagiribabu Apr 27 '17 at 13:14

4 Answers

25

I read this bug report here: https://bugs.php.net/bug.php?id=51386

Your problem seems to happen because one of the table's columns is a longblob or longtext.

longtext / longblob columns have a maximum length of 4294967295 bytes [4GB]; that's why mysqli tries to allocate that much memory for the buffer, to be sure no data is lost. I would suggest using mediumtext (16777215 bytes [16MB] max length) instead, which is usually enough for everything.
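For illustration, converting such a column could look like the sketch below, reusing the MysqliDb wrapper from the question. The column name `description` is only a placeholder for whatever your LONGTEXT column is called; back up the table first, since values longer than 16MB would be truncated:

    $db = new MysqliDb('LocalHost', 'root', 'PASSWD', 'DB');
    // "description" is a hypothetical column name; back up the table first,
    // because MEDIUMTEXT truncates values longer than 16MB.
    $db->query("ALTER TABLE db_table MODIFY description MEDIUMTEXT");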

Update: because this answer has seen some activity, I am adding this solution from Phil_1984 (see comments):

I use mysqli, and after reading that quote from the PHP dev, adding a $stmt->store_result(); between execute and bind_result seems to fix the issue for me.

=> If you use $stmt->store_result() you can use mysqli with longblob / longtext without getting the error.
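A minimal sketch of that sequence with plain mysqli, reusing the placeholder credentials and table from the question:

    $mysqli = new mysqli('LocalHost', 'root', 'PASSWD', 'DB');
    $stmt = $mysqli->prepare('SELECT * FROM db_table');
    $stmt->execute();
    // Buffer the result set on the client first: mysqli then knows the real
    // max_length of each column, so bind_result() no longer reserves 4GB
    // for a LONGTEXT/LONGBLOB column.
    $stmt->store_result();
    // ...then bind_result() and fetch() as in _dynamicBindResults() above.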

-

Old answer: I suggest that you either change the column to another type (mediumtext) or use PDO (I think it doesn't have that problem; see the sketch below). But if you want to keep the column as longtext, you have to switch your MySQL library.
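A rough sketch of the same query via PDO, again with the placeholder credentials from the question (as far as I know, PDO fetches rows without pre-allocating the 4GB bind buffer):

    $pdo = new PDO('mysql:host=localhost;dbname=DB', 'root', 'PASSWD');
    $stmt = $pdo->query('SELECT * FROM db_table');
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        // process one row at a time
    }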

Quote from PHP Dev:

This is a known limitation of ext/mysqli when using libmysql (always in 5.2 and previous) and when libmysql is enabled with 5.3 . The reason is that the server sends not too specific metadata about the column. This longtext has a max length of 4G and ext/mysqli tries to bind with the max length, to be sure no data loss occurs (data doesn't fit in the bind buffer on C level). However, that means 4G for a longtext/longblob column. ext/mysqli has been changed to have a way to work around that. You need to call mysqli_stmt_store_result() which will store the data locally, which means, of course a higher memory usage for PHP. However, because you use libmysql this won't hit the PHP's memory limit, for sure. During store_result the max_length of every column will be calculated and then when bind_result is executed only a buffer with size of max_length will be allocated, which will be definitely lower than 4G. In short, prepare execute store_result bind_result fetch...fetch...fetch

x4rf41
3

If you're trying to read in an entire table in one go, and that table has a lot of rows and columns, then running out of memory is inevitable. You can put it off by upping the memory limit in php.ini, but the problem will only reoccur when you add a few thousand more rows.

You need to rewrite your script to be more sensible about what it fetches. If you only need specific records, then pulling down the entire table and looking for the row(s) you want in the result set is horribly inefficient. Use a WHERE clause to specify what you really want to get. The rule of thumb with PHP/SQL applications is: "Use SQL wherever possible to specify what you want, then do what you need to do with it in PHP."

Of course it may be that there's an entirely legitimate reason why you need to process an entire table in PHP. In that case, you should fetch the data in chunks (say 100 rows at a time) with LIMIT and OFFSET, process those rows, get the next chunk, process those, and so on until you've gone through the entire table. That will use far less memory than trying to load the entire table at once.
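A rough sketch of that chunked approach using the MysqliDb wrapper from the question; selecting only the ID column also avoids pulling a huge text column at all:

    $db = new MysqliDb('LocalHost', 'root', 'PASSWD', 'DB');
    $offset = 0;
    $chunkSize = 100;
    do {
        // Fetch one chunk of rows at a time instead of the whole table.
        $rows = $db->query("SELECT ID FROM db_table LIMIT $offset, $chunkSize");
        foreach ($rows as $row) {
            // ...process a single row here...
        }
        $offset += $chunkSize;
    } while (count($rows) === $chunkSize);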

GordonM
  • It runs fine on a local virtual machine whose memory is also 128MB; watching with the `top` command shows the query is not using too much memory. But as soon as it goes onto the server this problem appears. Do I need to change every query in the whole project? Here is an example of the paging code: `SELECT * FROM db_table ORDER BY page_modified DESC, ID DESC LIMIT $startItem, $itemsPerPage` with `$startItem = 0`, `$itemsPerPage = 10` – user2637147 Aug 08 '13 at 09:02
  • The table on live is probably far bigger than the test table you have. Your script is trying to allocate 4 gigabytes – GordonM Aug 08 '13 at 09:10
  • You should really look at my answer; this is a known problem, not even a bug. It's mysqli + longtext or longblob. – x4rf41 Aug 08 '13 at 09:12
  • My table only has a few hundred rows of data. – user2637147 Aug 08 '13 at 09:12
  • @x4rf41 Which answer do you mean? I don't really use this site, I'm sorry. – user2637147 Aug 08 '13 at 09:14
  • @user2637147 Do you have a longtext or longblob column in your table? If yes, that's the answer to your problem: http://stackoverflow.com/a/18121874/278329 – x4rf41 Aug 08 '13 at 09:16
  • @x4rf41 I do have a LONGTEXT column. – user2637147 Aug 08 '13 at 09:18
  • @x4rf41 Can you give me a link? I'll take a look. – user2637147 Aug 08 '13 at 09:20
  • @user2637147 I did already, but I can do it again: http://stackoverflow.com/a/18121874/278329. You could also scroll around a bit here... – x4rf41 Aug 08 '13 at 09:21
  • @x4rf41 Thank you very much, that is the problem I have; I will adjust my queries accordingly. – user2637147 Aug 08 '13 at 09:33
  • The problem is not with what is actually inside the DB, but with the longtext type field, as first memory is reserved and then results are read. – Undry Jan 24 '23 at 13:40
1

Doesn't seem like a huge table! Seems like an endless loop! It tries to allocate about 4GB; I don't think you have such a big table...

Check that you don't create a loop here:

    call_user_func_array(array($stmt, 'bind_result'), $parameters);

Maybe you should post the code that is around this line.

0

You are exceeding the maximum available memory. You have two options:

  • Increase the maximum memory allowed for each PHP script, either in configuration (the memory_limit directive in php.ini) or at run time with ini_set('memory_limit', '200M'), as sketched after this list.

  • Improve the code so that it handles only the required information.
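For the first option, a minimal sketch (the 200M value is only an example; raising the limit treats the symptom rather than the cause):

    // Raise the limit at run time, at the top of the script...
    ini_set('memory_limit', '200M');

    // ...or set it permanently in php.ini instead:
    // memory_limit = 200M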

vicentazo