0

I'm getting the following error:

    Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 36 bytes)

The line that it is failing on is `while (($dataInputRecord = fgetcsv($inputFile, $maxRowLength, ",")) !== FALSE)` from this block of code:

    if (($inputFile = fopen($fileName, "r")) !== FALSE) {
        while (($dataInputRecord = fgetcsv($inputFile, $maxRowLength, ",")) !== FALSE) {
            // do stuff
        }
        fclose($inputFile);
    }

The entire file is 32 MB. Playing around with the size of `$maxRowLength` didn't help. I'm convinced that the problem is the file size, because truncating it to 8 KB solves the problem. How can I make it not care about file size?

Edit: Looks like my boss was misleading me about what the code within the loop was supposed to do

cck
  • Memory is never infinite, so there is no universal solution that will work with any CSV file size. There are a couple of workarounds; you could start with http://stackoverflow.com/questions/17520093/read-large-data-from-csv-file-in-php – fvu Feb 06 '17 at 18:09
  • I don't want to load the entire file into memory. The only variables that should persist through more than one iteration of the loop are a relatively small array of the rows that meet certain criteria and the handle for the input file. – cck Feb 06 '17 at 18:28
  • 3
    Well, I'd bet (with very high stakes) that the actual *problem* is whatever you omitted at `// do stuff` – may we have a look at it? – Franz Gleichmann Feb 06 '17 at 18:41
  • Adding some calls to [memory_get_usage](http://php.net/manual/en/function.memory-get-usage.php) may help you understand why `// do stuff` is taking up more memory than you expect; see the sketch below. – fvu Feb 06 '17 at 18:43
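
Following that suggestion, a minimal instrumentation sketch (the row counter and logging interval are only illustrative; the loop itself mirrors the code from the question):

    $rowCount = 0;

    if (($inputFile = fopen($fileName, "r")) !== FALSE) {
        while (($dataInputRecord = fgetcsv($inputFile, $maxRowLength, ",")) !== FALSE) {
            // do stuff

            // Log real memory usage every 10,000 rows to see whether it grows per iteration.
            if (++$rowCount % 10000 === 0) {
                error_log("Row $rowCount: " . memory_get_usage(true) . " bytes in use");
            }
        }
        fclose($inputFile);
    }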

2 Answers

0

You could increase the memory limit by changing `memory_limit` in php.ini.
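
For example, in php.ini (the 256M value is only an illustration; pick a limit your server can actually afford):

    ; php.ini -- raise the per-script memory ceiling (example value)
    memory_limit = 256M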

0

Well, try `ini_set('memory_limit', '256M');`

134217728 bytes = 128 MB
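
A minimal sketch of where such a call would go (256M is just an example value; this only raises the ceiling, it does not reduce what the loop actually allocates):

    // Raise the limit for this script only, before the CSV is processed.
    ini_set('memory_limit', '256M');

    // Optionally confirm the new limit took effect.
    echo ini_get('memory_limit');   // e.g. "256M"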

Niv Asraf
  • 2
    Be advised: that does not *solve* the problem, it only eases the symptoms. This will still fail for large files; they can just be a bit larger than before. – Franz Gleichmann Feb 06 '17 at 18:48