4

I have some simple code trying to write from one file to another.

$content=file_get_contents("C:\Users\Borut\Desktop\sql_PW1\mm_ads (1).sql");
file_put_contents("C:\Users\Borut\Desktop\sql_PW1\New", $content);

The file I am reading is about 80MB, and the memory limit in PHP is 128MB, but I get an error:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 80739522 bytes)

So the memory is exhausted even though the amount I am trying to allocate is smaller than the limit? I can make this work by increasing the memory limit, but I want to know why it doesn't work with this limit.

Borut Flis
  • The error message says you exhausted the available 128 Megabytes while trying to allocate 76.99 Megabytes. – ficuscr Feb 05 '13 at 17:12
  • Does the error still occur if the second line is not there? i.e. Is the error occurring on the first or second line? – leftclickben Feb 05 '13 at 17:21

4 Answers

6

The amount shown in "tried to allocate xxxx bytes" is the size of the additional allocation that failed, on top of what PHP had already used. This means you hit your 128MB limit while trying to allocate an additional ~80MB.

Even if you can fit the file into memory, when you know the file is going to be that large, it will be a lot better for you to use a combination of fopen/fread/fwrite/fclose.
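
A minimal sketch of that approach, reusing the paths from the question (the 1MB chunk size is arbitrary), could look like this:

$in  = fopen('C:\Users\Borut\Desktop\sql_PW1\mm_ads (1).sql', 'rb');
$out = fopen('C:\Users\Borut\Desktop\sql_PW1\New', 'wb');

// Read and write in 1MB chunks so only one chunk is ever held in memory
while (!feof($in)) {
    fwrite($out, fread($in, 1048576));
}

fclose($in);
fclose($out);

Single quotes are used here so the backslashes in the Windows paths are not treated as escape sequences.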

I assume that you're doing more than just reading the contents and writing them to another file, though, right? Because if that's all you need, you can just use the copy function.
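
If it really is just a copy, something like this (same placeholder paths as above) avoids loading the file into a PHP variable at all:

// copy() moves the data internally, so no ~80MB string is built in PHP
copy('C:\Users\Borut\Desktop\sql_PW1\mm_ads (1).sql', 'C:\Users\Borut\Desktop\sql_PW1\New');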

Colin M
  • The question here is why the 128 MB is exhausted. –  Feb 05 '13 at 17:15
  • Yeah, I am trying to do something more than just copy the files; it is just weird that the memory is exhausted if the file is smaller. – Borut Flis Feb 05 '13 at 17:19
  • Well, yes, that would be because PHP allocates a write buffer for the contents in `file_put_contents` in addition to the memory taken from the return of `file_get_contents`. – Colin M Feb 05 '13 at 17:21
  • That is a valid explanation, because without the file_put_contents it doesn't exhaust the memory. – Borut Flis Feb 05 '13 at 17:22
  • @BorutFlis if you posted more of your code we might be able to suggest a less memory-intensive way of doing whatever it is you are doing. – Sammitch Feb 05 '13 at 17:45
2

To download large files without running out of memory, try the following function:

function custom_put_contents($source_url='', $local_path=''){

    // Remember the current limits so they can be restored afterwards
    $time_limit   = ini_get('max_execution_time');
    $memory_limit = ini_get('memory_limit');

    // Lift both limits for the duration of the download
    set_time_limit(0);
    ini_set('memory_limit', '-1');

    // Fetch the remote file and write it to disk
    $remote_contents = file_get_contents($source_url);
    $response = file_put_contents($local_path, $remote_contents);

    // Restore the original limits
    set_time_limit((int) $time_limit);
    ini_set('memory_limit', $memory_limit);

    return $response;
}
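
A hypothetical call would look like this (the URL here is only a placeholder):

$bytes = custom_put_contents('http://example.com/mm_ads.sql', 'C:\Users\Borut\Desktop\sql_PW1\New');
var_dump($bytes); // number of bytes written, or false on failure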

Resources

Allowed memory size of 134217728 bytes exhausted

RafaSashi
1

My first guess is that file_get_contents() is allocating 80MB of memory and file_put_contents() is trying to allocate another 80MB. You can try to put

var_dump(memory_get_usage())

between the lines.
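
For example, with the code from the question:

var_dump(memory_get_usage()); // baseline
$content = file_get_contents("C:\Users\Borut\Desktop\sql_PW1\mm_ads (1).sql");
var_dump(memory_get_usage()); // should be roughly 80MB higher
file_put_contents("C:\Users\Borut\Desktop\sql_PW1\New", $content);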

Note that this error message does not tell you how much memory is actually in use:

Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 80739522 bytes)

So it can mean that PHP is actually using, say, 100MB and trying to allocate another 80MB.

And of course, for such large files you can use fopen()/fwrite() and avoid putting the whole file in memory at once.
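
As a sketch of that idea (not the original poster's code), fopen() can also be combined with stream_copy_to_stream(), which shuttles the data between the two handles in chunks:

$in  = fopen('C:\Users\Borut\Desktop\sql_PW1\mm_ads (1).sql', 'rb');
$out = fopen('C:\Users\Borut\Desktop\sql_PW1\New', 'wb');

// The whole 80MB never ends up in a single PHP variable
stream_copy_to_stream($in, $out);

fclose($in);
fclose($out);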

Also, if you just want to copy those files, you could simply use copy().

fsw
1

Loading an entire file into PHP can very well exhaust your memory. If you need to work with a file that is too large for your memory, you'll have to load it a bit at a time, which means you can't use file_get_contents().

But if all you're doing is copying the file, there's really no need to load it at all: PHP provides a copy() function that gets the job done without pulling the file into memory.

SDC