
Today, PHP (CLI) died with this error:

PHP Fatal error:  Allowed memory size of 134217728 bytes exhausted (tried to allocate 58724352 bytes)

(134217728 = 128 MB and 58724352 = 56 MB.)

The file it tried to download was only 73 MB, which is not anywhere near the 128 MB memory limit.

What does it mean by reaching the memory limit of 128 MB? My script's other code doesn't use anywhere near (128 - 73) MB... more like 2-3 MB tops...

And what does it mean by "tried to allocate 56 MB"? Is that referring to 56 MB over the limit? If so, it still doesn't make any sense since, again, 73 MB is already much less than 128 MB.

By the way, increasing the memory limit to 256 MB "fixed" this, as in it downloaded the file without dying or logging any errors, but I'd still like to understand what it means and why it failed when 128 MB should've been more than enough.
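
For reference, that raise can be scoped to a single script rather than the whole install - a minimal sketch (the `256M` just mirrors the figure above):

```php
<?php
// Per-script override; leaves php.ini untouched. Must run before the
// large allocation happens.
ini_set('memory_limit', '256M');
```

For a one-off CLI run, `php -d memory_limit=256M script.php` does the same without touching the code.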

PS: Yes, I know that I'm supposed to download large files in chunks instead of keeping the whole thing in RAM until it's finished, but the code to do that is simply too complex and annoying* to implement, so right now I'm just asking what PHP means by that error and this behaviour.

(*I have put it in my massive "to do" list...)

  • It's `58724352` above the limit - you need to show your code; it tried to allocate somewhere between `1` and `58724352` bytes more than `134217728`. – Flash Thunder Aug 22 '19 at 18:30
  • @Flash No, it tried to allocate 58724352 bytes and in the process crossed the limit of 134217728 bytes. It doesn't say by how much. – deceze Aug 22 '19 at 18:31
  • Marked as a duplicate before I could even answer. Helpful. The problem is very likely that you're inadvertently duplicating the data in your code. This can occur, for example, if you pass the data into a function call without passing it by reference. The data is copied during the function call, resulting in more memory being used than you expect. This can also occur if you do `$other_variable = $data_containing_variable`. – B. Fleming Aug 22 '19 at 18:32
  • @deceze you answered my comment before I edited it few seconds after posting (and your answer was dated 1 minute after), so your comment is not valid anymore, whoever upvoted it after another 1 minute simply can't read with understanding or is a really slow writer – Flash Thunder Aug 22 '19 at 18:32
  • @B.Fleming Data is not being copied merely by passing it into a function. PHP implements copy-on-write, which only copies data *when necessary* because it's being modified (see the sketch just after these comments). – deceze Aug 22 '19 at 18:33
  • @deceze We're splitting hairs here. I'm not trying to get into the nitty gritty details about how PHP handles data in memory. If that data is being passed into a function call or otherwise assigned to a different variable, then write actions are likely occurring somewhere. The point is to get the OP to look for cases where data might be being duplicated. – B. Fleming Aug 22 '19 at 18:35
  • This is perfectly well answered in https://stackoverflow.com/q/9432810/476. – deceze Aug 22 '19 at 18:35
  • Possible duplicate of [Fatal error: Allowed memory size in PHP when allocating less](https://stackoverflow.com/questions/9432810/fatal-error-allowed-memory-size-in-php-when-allocating-less) – csabinho Aug 24 '19 at 23:44
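
A rough demonstration of the copy-on-write behaviour discussed in the comments above - a throwaway sketch (variable names made up; exact byte counts vary by PHP version):

```php
<?php
// Assignment alone does not duplicate the underlying data; the copy
// only happens once one of the variables is actually written to.
$data = str_repeat('x', 10 * 1024 * 1024);   // ~10 MB string
echo memory_get_usage(), "\n";               // baseline

$copy = $data;                               // assignment: no copy yet
echo memory_get_usage(), "\n";               // roughly unchanged

$copy[0] = 'y';                              // first write triggers the copy
echo memory_get_usage(), "\n";               // jumps by ~10 MB
```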

1 Answer


What does it mean by reaching the memory limit of 128 MB?

It means PHP is configured with a limit of 128 MB per process - and at some point your script needed more than that to keep running.

Is that referring to 56 MB over the limit?

Not quite. `58724352` bytes (56 MB) is the size of the single allocation request that failed; memory already in use plus those 56 MB would have crossed the 128 MB limit. The message doesn't say by how much the limit was exceeded.
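
The shape of the message is easy to reproduce with a throwaway script, which makes the meaning of the two numbers concrete (exact byte counts vary by PHP version):

```php
<?php
// Run as: php -d memory_limit=8M repro.php
// Expected (numbers approximate):
//   PHP Fatal error:  Allowed memory size of 8388608 bytes exhausted
//   (tried to allocate 16777224 bytes)
// "tried to allocate" is the size of the one request that failed,
// not the amount by which the limit was exceeded.
$s = str_repeat('a', 16 * 1024 * 1024);      // a single ~16 MB allocation
```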

My script's other code doesn't use anywhere near (128 - 73) MB... more like 2-3 MB tops...

The file it tried to download was only 73 MB, which is not anywhere near the 128 MB memory limit.

The data it read was probably base64 encoded, which adds roughly a third of overhead, and the decoded data may exist in memory alongside the encoded copy before being written to disk.
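
If that theory holds, the arithmetic fits - a quick sketch of the base64 overhead (illustrative only; nothing below comes from the actual download code):

```php
<?php
// base64 turns every 3 input bytes into 4 output bytes (~33% growth).
$raw = random_bytes(3 * 1024 * 1024);        // 3 MB of raw data
$enc = base64_encode($raw);
printf("raw: %d bytes, base64: %d bytes\n", strlen($raw), strlen($enc));
// Scaled up: a 73 MB payload arrives as ~97 MB of base64 text, and
// decoding means holding ~97 MB + ~73 MB at once - well past 128 MB.
```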

I know that I'm supposed to download large files in chunks, instead of keeping it all in RAM until finished

You don't have to implement the chunking yourself - but giving each PHP process a ridiculous amount of memory is not the right way to solve the problem either.

but the code to do that is simply too complex and annoying* to implement

`curl --output $localfile $url`;

Too complicated?
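
(The backticks above are PHP's shell-execution operator, not a separate terminal session.) If shelling out is off the table because your existing cURL options have to carry over, ext/curl can stream straight to disk as well - a minimal sketch, with `$url` and `$localfile` standing in for your real values:

```php
<?php
// Write the response body to a file handle as it arrives, instead of
// accumulating it in a PHP string via CURLOPT_RETURNTRANSFER.
$url       = 'https://example.com/big.bin';  // placeholder
$localfile = '/tmp/big.bin';                 // placeholder

$fp = fopen($localfile, 'wb');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);         // stream body to $fp
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
// ...whatever other options your wrapper already sets go here...
$ok = curl_exec($ch);
curl_close($ch);
fclose($fp);

if (!$ok) {
    unlink($localfile);                      // drop the partial file
}
```

Peak memory stays around cURL's internal buffer size instead of the full file, since nothing is held in a PHP string.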

– symcbean
  • I tried calling the memory-usage functions continuously while the file downloaded. The reported usage never went more than a couple of MB beyond the 73 MB file size... so I have no idea what caused it to fail. –  Aug 22 '19 at 19:08
  • And I have no idea what you mean by that command-line curl command... I'm using cURL in PHP. All kinds of complex options are set in my wrapper function. I can't just run such a terminal command and have it use the defaults for everything... –  Aug 22 '19 at 19:09