
So I have a database table that contains 3500 elements. And when I try to display this data, I get this message :

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 77 bytes) in C:\Program Files\EasyPHP-5.3.8.0\www\Symfony\vendor\doctrine\lib\Doctrine\ORM\Mapping\ClassMetadata.php on line 344

Touki
MarGa

2 Answers


I think you're looking for an iterator.
This article shows you how to handle large result sets.

$q = $this->_em->createQuery("<DQL to select the objects I want>");
$iterableResult = $q->iterate();
while (($row = $iterableResult->next()) !== false) {
    // do stuff with the data in the row; $row[0] is always the object
    $this->_em->detach($row[0]); // detach from Doctrine so it can be GC'd immediately
}

Instead of loading all the data into a single array, this code creates an iterator that hydrates one row at a time, which avoids that large allocation and this kind of error.

However, if you're just looking for a way to increase PHP's memory limit, take a look at this answer:

ini_set('memory_limit', '1G');
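Note that `ini_set()` only affects the current request and must run before memory is exhausted. If you want the higher limit everywhere, the equivalent setting can go in `php.ini` instead (a sketch; the exact file location depends on your EasyPHP install):

```ini
; php.ini — raise the per-script memory ceiling
; (the 134217728 bytes in the error message is the 128M default)
memory_limit = 1G
```

PHP accepts the K/M/G shorthand for this directive, so `1G` means one gigabyte.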
Touki

Google for "pagination" :) There is usually no reason to fetch every entity at once, because (for example) who wants to wait for, and read, a huge list of 3500 entries on a single page? If you really do need all of them (for further processing, say), read and process them in chunks.
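Chunked fetching can be sketched with Doctrine's `setFirstResult()`/`setMaxResults()`; the DQL, entity name, and page size below are placeholders, not from the original question:

```php
<?php
// Process entities in pages of 100 instead of loading all 3500 at once.
// $em is assumed to be a configured Doctrine\ORM\EntityManager;
// "MyEntity" is a hypothetical entity class.
$pageSize = 100;
$offset = 0;
do {
    $entities = $em->createQuery('SELECT e FROM MyEntity e')
        ->setFirstResult($offset)   // skip rows already processed
        ->setMaxResults($pageSize)  // fetch at most one page
        ->getResult();

    foreach ($entities as $entity) {
        // ... process $entity ...
    }

    $em->clear();       // detach all managed entities so they can be GC'd
    $offset += $pageSize;
} while (count($entities) === $pageSize);
```

Calling `clear()` after each page keeps the identity map from growing, so memory use stays roughly constant regardless of the table size.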

btw: It's not Symfony2's fault that the memory is limited :)

KingCrunch
  • Actually I'm using the jQuery plugin DataTables, which displays a fixed number of rows at a time using pagination. So which memory is limited? And how do I extend it (if that's possible)? – MarGa Mar 15 '13 at 08:39
  • jQuery can't access the database itself, so you are probably able to do such optimizations in the backend, aren't you? ;) – KingCrunch Mar 15 '13 at 08:50