
I made a download script in PHP that was working until yesterday. Today I tried to download one of the files only to discover that suddenly it stopped working:

PHP Fatal error: Allowed memory size of 67108864 bytes exhausted (tried to allocate 119767041 bytes) in E:\home\tecnoponta\web\aluno\download.php on line 52

For some reason PHP is trying to allocate the size of the file in memory, and I have no idea why. If the file size is smaller than the memory limit, I can download it without a problem; the problem is with bigger files.

I do know that it can be corrected by increasing the memory limit in php.ini, or even by using ini_set() in the code, but I would like a more precise fix and an answer to why it stopped working.

Here's my code:

$file = utf8_decode($_GET['d']);

header('Content-Description: File Transfer');
header('Content-Disposition: attachment; filename='.$file);
header('Content-Type: application/octet-stream');
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');

$file = "uploads/$curso/$file";

ob_clean();
flush();
readfile($file);

echo "<script>window.history.back();</script>";
exit;
– bruno
  • Why it stopped working is pretty simple: PHP ran out of memory while attempting to read the file you're telling it to send to the client. You should try sending it down in chunks. This might help you: http://stackoverflow.com/a/6527829/1729859 – mituw16 Jul 07 '15 at 19:34
  • Have you checked whether you have nested output buffering for some reason, using ob_get_level()? http://php.net/manual/en/function.ob-get-level.php – asiop Jul 07 '15 at 19:34
  • Also, try ob_end_flush() instead of ob_clean(). – asiop Jul 07 '15 at 19:38
  • @mituw16 readfile will itself 'chunk' through the file; the problem is the buffer between it and the connected client. (A manual chunking loop would not fix that issue, since its chunks would likewise pile up in the buffer; the output buffering needs to be terminated. The sketch below these comments illustrates the effect.) – user2864740 Jul 07 '15 at 19:44
  • I've tried ob_end_flush() before and it actually works, but the files get corrupted... – bruno Jul 07 '15 at 19:48
  • @bruno That's because ob_end_flush() sends the contents of the buffer to the output, which is like writing extra data to the start of the file. See my answer; you need ob_end_clean(). – Clickbeetle Jul 07 '15 at 19:57
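
To make the buffering effect described in the comments concrete, here is a minimal, illustrative sketch (not from the original thread; $file stands for any file path, and you would want a small file, since a large one hits the memory limit at exactly this point):

ob_start();                   // simulate an active output buffer
readfile($file);              // readfile() emits chunks, but each chunk is
                              // captured by the buffer instead of being sent
$buffered = ob_get_length();  // roughly filesize($file), now held in memory
ob_end_clean();               // discard the buffer again
echo $buffered;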

4 Answers

22 votes

From php.net for readfile:

readfile() will not present any memory issues, even when sending large files, on its own. If you encounter an out of memory error ensure that output buffering is off with ob_get_level().

From php.net for ob_clean:

This function does not destroy the output buffer like ob_end_clean() does.

What you need is this:

if (ob_get_level()) {
    ob_end_clean();
}

Also consider adding this:

header('Content-Length: ' . filesize($file));
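
Folded into the question's script, a minimal sketch might look like this (the while loop is a slightly defensive variant that also clears nested buffers, which ob_get_level() counts; $file is assumed to already hold the resolved path):

// Close and discard every open output buffer so readfile()
// streams straight to the client instead of into memory.
while (ob_get_level()) {
    ob_end_clean();
}

header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
header('Content-Length: ' . filesize($file));

readfile($file);
exit;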
– Clickbeetle
  • Why not use ob_end_flush() (http://php.net/manual/en/function.ob-end-flush.php) instead of ob_end_clean() (http://php.net/manual/en/function.ob-end-clean.php)? – Aalex Gabi Jun 15 '18 at 10:10
  • @AalexGabi You can, if you want to send the contents of the buffer to the client. For file transfers that usually leads to corruption, because you send the user data that is not part of the file; it just happened to be sitting in your output buffer. – Clickbeetle Sep 04 '18 at 13:05
  • Even with this snippet in place we are facing OOM exceptions. Could the Symfony StreamedResponse and NewRelic APM Agent have something to do with this? – Rvanlaak Jan 24 '23 at 20:40
9 votes

The ob_get_level() solution always resulted in empty output for me.

This solution works for me:

<?php

$myFile = $myPath . $myFileName;

if (file_exists($myFile)) {

    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $myFileName . '"');
    header('Expires: 0');
    header('Cache-Control: must-revalidate');
    header('Pragma: public');
    header('Content-Length: ' . filesize($myFile));

    // Alternative for readfile($myFile): copy the file stream
    // straight to the output stream.
    $myInputStream  = fopen($myFile, 'rb');
    $myOutputStream = fopen('php://output', 'wb');

    stream_copy_to_stream($myInputStream, $myOutputStream);

    fclose($myOutputStream);
    fclose($myInputStream);

    exit;

} else {
    echo "*** ERROR: File does not exist: " . $myFile;
}

?>

So: use stream_copy_to_stream() instead of readfile().

Hope this can help you.
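
A related sketch, if you want explicit control over how much is read at a time, is a manual fread() loop (the 8192-byte chunk size is an arbitrary choice here, and this assumes output buffering is already off):

$handle = fopen($myFile, 'rb');
while (!feof($handle)) {
    // Read and send one chunk at a time; flush() pushes each
    // chunk to the client so nothing accumulates server-side.
    echo fread($handle, 8192);
    flush();
}
fclose($handle);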

– Al-Noor Ladhani
0 votes

I solved it like this:

if (file_exists($filepath)) {

    header('Content-Description: File Transfer');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($filepath) . '"');
    header('Content-Transfer-Encoding: binary');
    header('Expires: 0');
    header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
    header('Pragma: public');
    header('Content-Length: ' . filesize($filepath));

    // End any active output buffering so readfile() streams
    // directly to the client instead of filling memory.
    if (ob_get_level()) {
        ob_end_clean();
    }

    readfile($filepath);
}
– Pinonirvana
-4 votes

The size of the file is greater than the PHP memory limit.

Do

ini_set('memory_limit', '16M');

and set the '16M' part to at least the file size.
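
If you do take this route, the limit could be derived from the file instead of hard-coded; a rough sketch ($file stands for the target path, the 8 MB of headroom is an arbitrary allowance for the rest of the script, and, as the comments below point out, this approach is better avoided):

// Raise the limit to the file size plus some headroom;
// ini_set() accepts a plain byte count passed as a string.
$needed = filesize($file) + 8 * 1024 * 1024;
ini_set('memory_limit', (string) $needed);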

– Jaime
  • I'm downvoting this because, while setting a higher memory limit would fix the OP's immediate problem, a better solution would be to read the file in chunks, which can handle files of any size versus a static hard limit. – mituw16 Jul 07 '15 at 19:35
  • Also, according to the PHP manual, readfile does not take memory the size of the file it's outputting; when a memory problem occurs it is because the PHP output buffer is expanding. So setting a high memory limit is not a good way to solve this. – asiop Jul 07 '15 at 19:40