
Using PHP, I am trying to serve large files (possibly up to 200MB) that aren't in a web-accessible directory because of authorization issues. Currently, I use a readfile() call along with some headers to serve the file, but PHP seems to load the whole file into memory before sending it. I intend to deploy on a shared hosting server, which won't allow me to use much memory or add my own Apache modules such as X-Sendfile.

I can't put my files in a web-accessible directory for security reasons. Does anybody know a less memory-intensive method that I could deploy on a shared hosting server?

EDIT:

    if (/* My authorization here */) {
        $path = "/uploads/";
        $name = $row[0];           // This is a MySQL reference with the filename
        $fullname = $path . $name; // Create the full filename
        $fd = fopen($fullname, "rb");
        if ($fd) {
            $fsize = filesize($fullname);
            $path_parts = pathinfo($fullname);
            $ext = strtolower($path_parts["extension"]);
            switch ($ext) {
                case "pdf":
                    header("Content-type: application/pdf");
                    break;
                case "zip":
                    header("Content-type: application/zip");
                    break;
                default:
                    header("Content-type: application/octet-stream");
                    break;
            }
            header("Content-Disposition: attachment; filename=\"" . $path_parts["basename"] . "\"");
            header("Content-length: $fsize");
            header("Cache-control: private"); // use this to open files directly
            while (!feof($fd)) {
                $buffer = fread($fd, 1024 * 1024); // read 1MB at a time
                echo $buffer;
                ob_flush();
                flush();    // These two flush commands seem to have helped with performance
            }
            fclose($fd); // only close a handle that was actually opened
        } else {
            echo "Error opening file";
        }
    }
brenns10
    Disagree that this is a duplicate; this guy is trying to download through PHP, whereas the other question is just about how to download a large file. Different issues, honestly. – Richard Lyle Jan 24 '16 at 06:51

4 Answers


If you use fopen and fread instead of readfile, that should solve your problem.

There's a solution in PHP's readfile() documentation showing how to use fread to do what you want.
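
In case that link goes stale, here's a minimal sketch of the chunked-read approach it describes; the path, MIME type, and chunk size below are placeholders:

    // Minimal sketch of the fopen/fread approach, assuming $filepath points
    // at the protected file; adjust the Content-Type and chunk size to taste.
    $filepath = '/protected/uploads/bigfile.zip'; // placeholder path
    $handle = fopen($filepath, 'rb');
    if ($handle) {
        header('Content-Type: application/octet-stream');
        header('Content-Length: ' . filesize($filepath));
        while (!feof($handle)) {
            echo fread($handle, 8192); // stream 8KB at a time, keeping memory use flat
            flush();                   // push each chunk out to the client
        }
        fclose($handle);
    }
    exit;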

Francois Deschenes
  • Using a chunked readfile with the `flush()` and `ob_flush()` is kind of working as stated in the documentation. Now I am having trouble with the download just being interrupted near the end of the file. Any idea there? – brenns10 Jun 29 '11 at 22:25
  • 1
    @brenns10 - The example doesn't use `flush()` and `ob_flush()`. Using those will cause the data to go in memory until it's displayed, something you might want to avoid if you have limited resources. With that said, I'm not sure why it skipping the end without seeing your code. It could be that the file's last few bytes aren't read or that you're not flushing after the last read. – Francois Deschenes Jun 29 '11 at 22:47
  • I have tried using the script both with the flush commands and without, and when I use them, the download starts immediately and doesn't use much memory (on my WAMP). When I don't use the flush commands, there is quite a delay before the download starts (which, I assume, is PHP reading the file into memory), and during that time there is a huge RAM spike. I will post the code I am using in the original question. – brenns10 Jun 29 '11 at 23:11
  • @brenns10 - The code seems to be fine. The only thing I should point out is that `ob_flush()` can result in a PHP Notice if there's no buffer to flush. It is possible that it's adding some text at the end of your file and making it invalid. You could add an "@" symbol in front of the `ob_flush()` to suppress the error and see if that solves the problem. – Francois Deschenes Jun 29 '11 at 23:46
  • Thank you for all your help, but as it turns out, I will be going with a different solution using FTP. HTTP just isn't designed to transfer things that big. But thanks again for the help! – brenns10 Jul 09 '11 at 02:56
  • 1
    @brenns10, you could make use of this **PHP download script**. Works for massively large files with great MIME support. [**Large File Download**](http://stackoverflow.com/questions/3176942/using-php-to-download-files-not-working-on-large-files/21354337#21354337) I checked out this by downloading about 2GB file. It works. – webblover Jan 25 '14 at 18:22
  • 1
    Please update/correct this answer. At least currently `readfile` already 'chunks' internally. The only problem that arises is when the output buffer is left on; but such would *any* output - fopen/fread/readfile or otherwise. – user2864740 Jul 07 '15 at 20:33
  • 3
    "a solution" link is dead – user3044394 Aug 06 '20 at 01:15
  • You should quote the sample here; your link is broken. – a55 Mar 02 '21 at 10:12

To download large files from the server, I changed the settings below in the php.ini file:

    upload_max_filesize = 1500M
    max_input_time = 1000
    memory_limit = 640M
    max_execution_time = 1800
    post_max_size = 2000M

Now I am able to upload and download a 175MB video on the server. Since I have a dedicated server, making these changes was easy.

Below is the PHP script to download the file. I have not made any changes to this code snippet for large files.

    // Begin writing headers
    ob_clean(); // Clear anything previously written to the output buffer

    if ($filetype == 'application/zip')
    {
        if (ini_get('zlib.output_compression'))
            ini_set('zlib.output_compression', 'Off');
        $fp = @fopen($filepath, 'rb');
        if (strstr($_SERVER['HTTP_USER_AGENT'], "MSIE"))
        {
            header("Content-Type: $content_type"); // double quotes, so the variable interpolates
            header('Content-Disposition: attachment; filename="' . $filename . '"');
            header('Expires: 0');
            header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
            header("Content-Transfer-Encoding: binary");
            header('Pragma: public');
            header("Content-Length: " . filesize(trim($filepath)));
        }
        else
        {
            header("Content-Type: $content_type");
            header('Content-Disposition: attachment; filename="' . $filename . '"');
            header("Content-Transfer-Encoding: binary");
            header('Expires: 0');
            header('Pragma: no-cache');
            header("Content-Length: " . filesize(trim($filepath)));
        }

        fpassthru($fp);
        fclose($fp);
    }
    elseif ($filetype == 'audio' || $filetype == 'video')
    {
        global $mosConfig_absolute_path, $my;
        ob_clean();
        header("Pragma: public");
        header('Expires: 0');
        header('Cache-Control: no-store, no-cache, must-revalidate');
        header('Cache-Control: pre-check=0, post-check=0, max-age=0');
        header("Cache-Control: public");
        header("Content-Description: File Transfer");
        header("Content-Type: application/force-download"); // overridden by the real type below
        header("Content-Type: $content_type");
        header("Content-Length: " . filesize(trim($filepath)));
        header("Content-Disposition: attachment; filename=\"$filename\"");
        // Force the download
        header("Content-Transfer-Encoding: binary");
        @readfile($filepath);
    }
    else // for all other types of files except zip, audio/video
    {
        ob_clean();
        header("Pragma: public");
        header('Expires: 0');
        header('Cache-Control: no-store, no-cache, must-revalidate');
        header('Cache-Control: pre-check=0, post-check=0, max-age=0');
        header("Cache-Control: public");
        header("Content-Description: File Transfer");
        header("Content-Type: $content_type");
        header("Content-Length: " . filesize(trim($filepath)));
        header("Content-Disposition: attachment; filename=\"$filename\"");
        // Force the download
        header("Content-Transfer-Encoding: binary");
        @readfile($filepath);
    }
    exit;
ursitesion

If you care about performance, there is X-Sendfile, available as a module for Apache, nginx (where it's called X-Accel-Redirect), and lighttpd. Check the user comments in the readfile() documentation.
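
When the module is available, the PHP side reduces to emitting a header after the authorization check. A minimal sketch, assuming Apache with mod_xsendfile installed and the directory allowed via XSendFilePath (the paths here are placeholders):

    // Minimal sketch, assuming mod_xsendfile is installed and configured;
    // the web server streams the file from disk, not PHP.
    if ($authorized) { // placeholder for your own authorization check
        header('Content-Disposition: attachment; filename="bigfile.zip"');
        header('X-Sendfile: /protected/uploads/bigfile.zip'); // absolute path outside the web root
        exit;
    }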

There are also modules for these web servers that accept a URL with an additional hash value, which allows the file to be downloaded for a short time period. This can also be used to solve authorization issues.
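
To illustrate the idea, here's a sketch of generating such a time-limited link on the PHP side; the secret, parameter names, and hash recipe are placeholders that have to match whatever the server-side module (e.g. nginx's secure_link) actually expects:

    // Hypothetical sketch: build an expiring download link whose hash the
    // web server verifies. The exact recipe must match your server module.
    $secret  = 'change-me';             // shared secret, also configured server-side
    $expires = time() + 600;            // link valid for ten minutes
    $file    = '/downloads/bigfile.zip';
    $hash    = md5($secret . $file . $expires);
    $url     = $file . '?h=' . $hash . '&e=' . $expires;
    echo '<a href="' . htmlspecialchars($url) . '">Download</a>';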

Karoly Horvath
  • This sounds like a useful alternative, but is this something that I can do on a shared hosting server? I think that the hosting companies have a few set modules that they use, so I probably won't be able to use this on a shared server. – brenns10 Jun 29 '11 at 23:44
  • Good, you have already answered your question :) Ask them. – Karoly Horvath Jun 29 '11 at 23:50

You could also handle this in the style of the Gordian Knot: sidestep the problem entirely. Keep the files in a non-accessible directory, and when a download is initiated you can simply

    $tempstring = rand();
    symlink('/filestore/filename.extension', '/www/downloads/' . $tempstring . '-filename.extension');
    echo 'Your download is available here: <a href="/downloads/' . $tempstring . '-filename.extension">Download</a>';

and set up a cronjob to unlink() any download links older than 10 minutes; a sketch of that cleanup follows below. Virtually no processing of your data is required, no massaging of HTTP headers, and so on.
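
A sketch of that cleanup job, assuming the symlinks live in /www/downloads (a placeholder path); run it from cron every few minutes:

    // Remove any symlink in the downloads directory older than ten minutes.
    $dir = '/www/downloads'; // placeholder; must match where the symlinks are created
    foreach (glob($dir . '/*') as $link) {
        if (is_link($link) && (time() - lstat($link)['mtime']) > 600) {
            unlink($link); // the symlink goes away; the real file is untouched
        }
    }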

There are even a couple of libraries out there for just this purpose.

Winfield Trail