6

I have tried all the headers, with and without Content-Length.

The problem is that the download craps out around the halfway mark. But only most of the time.

Sometimes it works just fine.

Server resource usage is trivial, all the config files look fine, and I don't run any other PHP scripts while the download is going.

Has anyone seen this before? Is it maybe not even my fault? Why only sometimes?

$file = "http://domain.com/files/".$item_number.".mov";
header( 'Pragma: no-cache' );
header( 'Cache-Control: must-revalidate, post-check=0, pre-check=0' );
header( 'Content-Description: File Download' );
header( 'Content-Type: application/force-download');
//header( 'Content-Type: video/quicktime' ); // doesn't seem to download at all with this
header( 'Content-Length: '.filesize( $file ) );
header( 'Content-Disposition: attachment; filename="'.basename( $file ).'"' );
header( 'Content-Transfer-Encoding: binary' );
readfile( $file ); 
exit();

Thank you, and sorry, it's my first time.

patinyo
  • 71
  • 3
  • 4
    Is this download being made through a PHP script? Can you show us the code? – Igor Azevedo Jun 13 '12 at 00:16
  • 3
    Stop f***ing closing questions just because they're asked by a newbie. This is 100% a real question. – Ali Jun 13 '12 at 00:17
  • Have you had a look at your log files? PS - Code please. – Ayush Jun 13 '12 at 00:22
    I can't find anything suspicious in the logs, but I put up some code – patinyo Jun 13 '12 at 00:38
  • 1
    Is the file actually hosted on another domain or can you serve the file directly from the server rather than over HTTP? Essentially you are proxying the request, PHP has to make a request to `http://domain.com/files/xxx.mov` and at the same time transfer it to the user, using 2x the bandwidth (on your server). That may explain some of the issues too. Maybe also use [set_time_limit](http://php.net/set_time_limit) in the event that is an issue. – drew010 Jun 13 '12 at 00:56
    Thanks drew, but when I tried to get rid of the http, it downloads one of those empty files. I moved the file into the same directory and called it `$file = $item.".mov";` – patinyo Jun 13 '12 at 01:22

1 Answer

3

This can have several causes.

You say that server resources are OK, but I think PHP's readfile() can load the entire file into memory before sending it. Fortunately there is a good solution in the docs that you should definitely use to prevent running into problems with multiple concurrent downloads :)
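The chunked approach from the user notes in the readfile() docs looks roughly like this (a sketch; the function name and chunk size are conventions from those notes, not anything from your script):

```php
<?php
// Stream a file to the client in 1 MB chunks instead of one readfile() call,
// so only one chunk is ever held in memory at a time.
function readfile_chunked($filename, $chunkSize = 1048576)
{
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    $bytesSent = 0;
    while (!feof($handle)) {
        $buffer = fread($handle, $chunkSize);
        echo $buffer;
        flush();                       // push the chunk out to the client right away
        $bytesSent += strlen($buffer);
    }
    fclose($handle);
    return $bytesSent;                 // total bytes written to the output stream
}
```

You would then call `readfile_chunked($file);` where you currently call `readfile($file);`.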

Also, did you set your script execution time to infinite?
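If not, removing the limit at the top of the script is a one-liner (a sketch; assumes you actually want no limit for this one request):

```php
<?php
// Disable the max execution time for this request only, so a slow client
// cannot make the script die mid-download when max_execution_time is hit.
set_time_limit(0);
```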

Next would be your server settings. Maybe the server closes the connection because of a timeout or a resource hog.
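A quick way to see the PHP-side limits that commonly kill long downloads (a sketch; which values matter depends on your setup):

```php
<?php
// Print the ini settings that most often cut long downloads short.
echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";
echo 'memory_limit:       ' . ini_get('memory_limit') . "\n";
echo 'output_buffering:   ' . ini_get('output_buffering') . "\n";
```

Web-server-side timeouts (e.g. Apache's `Timeout` directive) live outside PHP and have to be checked in the server config itself.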

Use, for instance, Google Chrome's developer tools to closely examine the headers you are sending; PHP sets some headers on its own. You should put session_cache_limiter(false); at the beginning of your script. If I remember correctly, it even sends Expires: Thu, 19 Nov 1981 08:52:00 GMT... Why is this bad? Coming up next ;)
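In the script, that looks like this (a sketch; only relevant if sessions are in play at all):

```php
<?php
// Put this at the very top, before any session_start(), so PHP does not
// append its own Cache-Control / Expires / Pragma headers to the response.
session_cache_limiter(false);   // passing '' is the modern equivalent
// session_start();             // only if the download script uses sessions
```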

You can send the ETag and Last-Modified headers. In sum, they are a unique identifier for your resource that makes it possible to resume downloads. I think looking up good articles about this is better than just giving you a few sentences here. In your case you could use PHP's stat() function to create an ETag.
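As a starting point, building a validator from stat() data could look like this (a sketch; the helper name `file_etag` is mine, and `$file` is assumed to be a local path, not an HTTP URL):

```php
<?php
// Build an ETag from inode, size, and mtime, similar to what Apache does,
// so clients can revalidate and resume the download.
function file_etag($path)
{
    $stat = stat($path);
    if ($stat === false) {
        return null;
    }
    return sprintf('"%x-%x-%x"', $stat['ino'], $stat['size'], $stat['mtime']);
}

// Usage inside the download script:
// header('ETag: ' . file_etag($file));
// header('Last-Modified: ' . gmdate('D, d M Y H:i:s', filemtime($file)) . ' GMT');
// header('Accept-Ranges: bytes');   // advertise resume support to the client
```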

There are some people struggling with headers and downloads; you could start (or fix your case) here

Summary:

  • Check your PHP/server expiry/limitations/resource settings
  • Be sure to send the file buffered
  • Make it possible to resume downloads
  • Check the headers you are sending

After this you should be fine. And with resume support, even if the download should still break, the client can pick it up again...

Good progress and success!

nico gawenda
  • 3,648
  • 3
  • 25
  • 38
  • 1
    From the manual "readfile() will not present any memory issues, even when sending large files, on its own. If you encounter an out of memory error ensure that output buffering is off with ob_get_level()." – Matthew Jun 13 '12 at 01:35
    Maybe a shared host or something else – I just summarized everything I know that could be the reason :) readfile() will not present a memory issue *if*... It is definitely one of the list. – nico gawenda Jun 13 '12 at 01:49
  • http://stackoverflow.com/questions/6627952/why-does-readfile-exhaust-php-memory server crap, multiple users, essentially what I said - and main thing still is to eliminate resource hogs like this – nico gawenda Jun 13 '12 at 02:13
  • Those comments are quite frankly incorrect. If you disable output buffering, `readfile()` goes straight from one stream to the next. It does not use lots of memory. Something like `echo file_get_contents($filename)` *would* always use a lot of memory. – Matthew Jun 13 '12 at 02:30
    "Check your PHP/server expiry/limitations/resource settings". I did not recommend `file_get_contents()`. I recommended checking for possible problems with standard settings in this case. A check like you did:) Saying "incorrect" while using "if" is not the best thing, just as a sidenote. Neither you nor I know anything about the config or possibilities, so the best thing is to offer thoughts on what COULD be the reason, instead of neglecting possibilities because of... possibilities. – nico gawenda Jun 13 '12 at 02:42