
First time posting, so sorry if I get anything wrong.

I'm trying to create a secure file download storefront. It works, but only with small files. I have a 1.9GB product for download and the transfer keeps stopping partway through. The received sizes are inconsistent too: I've had up to 1GB, but often it's 200-500MB.

The aim is to create a space where only users with a registered account can download the file, so a plain direct link is not an option.

I've read elsewhere on this site that resetting the script timeout inside the file-read loop should get around the script time limit.

try
{
    $path = "products/" . $filename;
    $num_bytes = filesize($path);
    $mp3content = fopen($path, "rb") or die("Couldn't get handle");
    $bytes_read = 0;
    if ($mp3content) {
        while (!feof($mp3content)) {
            set_time_limit(30);                 // reset the script timeout for each chunk
            $buffer = fread($mp3content, 4096);
            echo $buffer;
            flush();                            // push the chunk to the client
            $bytes_read += strlen($buffer);     // count bytes actually read, not the buffer size
        }
        fclose($mp3content);
    }
}
catch (Exception $e)
{
    error_log("User failed to download file: " . $row['FILENAME'] . " (" . $row['MIMETYPE'] . ")\n" . $e, 1, getErrorEmail());
}

error_log("Bytes downloaded:" . $bytes_read . " of " . $num_bytes, 1, getErrorEmail());

I don't receive the final error log email on large files that fail, but I do get the emails on smaller files that succeed, so I know the code works in principle.

holdmykidney
  • Maybe you have "Safe mode" activated? Then `set_time_limit(30);` has simply no effect. – arkascha Nov 15 '14 at 11:10
  • Oh, and a side note: you should prefer the ogg vorbis format over mp3. It is technically superior and more politically correct :-) – arkascha Nov 15 '14 at 11:12
  • Agreed but in a commercial setting, the old adage remains about giving the people what they want. Is Safe mode something I can change within the script? This is hosted storage, so I can't configure anything without a support ticket (which is strictly a 9-5 operation!). – holdmykidney Nov 15 '14 at 11:27

3 Answers


Turns out my hosting is the issue. The PHP code is correct, but my shared hosting environment limits all PHP scripts to 30 seconds, while the code above takes about 15 minutes to run its course. Unless someone can come up with a way of keeping PHP tied up in file-handling methods that don't contribute to the timer, it looks like this one is stuck.

holdmykidney
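One way to keep PHP out of the transfer entirely is to hand the streaming off to the web server after the login check. The sketch below assumes Apache with mod_xsendfile enabled (a big assumption on shared hosting, and worth a support ticket to confirm); `requireLogin()` is a hypothetical stand-in for the site's own account check:

```php
<?php
// Hypothetical sketch: requires Apache with mod_xsendfile enabled.
// requireLogin() stands in for the site's own authentication check,
// and $filename is assumed to be validated against the user's purchases.
requireLogin();

header('Content-Type: audio/mpeg');
header('Content-Disposition: attachment; filename="' . basename($filename) . '"');

// Apache intercepts this header and streams the file itself; the PHP
// script exits immediately, so the 30-second limit never applies to
// the transfer.
header('X-Sendfile: ' . realpath('products/' . $filename));
exit;
```

nginx offers the same pattern via its `X-Accel-Redirect` header, though the path given must be an internal location rather than a filesystem path.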

Try removing the limit entirely at the top of the script:

set_time_limit(0);

A value of 0 means no time limit, though like any set_time_limit() call it has no effect when safe mode is enabled.
DJ MHA

I had the same problem, so I took a different approach. When a file is requested, I make a hard link to it inside a randomly named directory under the "downloads" folder and give the user that link, valid for 4 hours.

The file URL ends up looking like this:

http://example.com/downloads/3nd83js92kj29dmcb39dj39/myfile.zip

Every call to the script scans the "downloads" folder and deletes any directories (and their contents) created more than 4 hours ago, to keep things clean.

This is not safe against brute-force attacks, but that can be worked around.
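A minimal sketch of this approach, assuming the "products" and "downloads" directories sit next to the script and that $filename has already been validated against the logged-in user's purchases (the paths, the token length, and the 4-hour window are all illustrative; random_bytes() needs PHP 7, so on older versions substitute openssl_random_pseudo_bytes()):

```php
<?php
// Illustrative paths; adjust to the site's layout.
$downloadRoot = __DIR__ . '/downloads';
$productDir   = __DIR__ . '/products';

// 1. Purge link directories older than 4 hours.
foreach (glob($downloadRoot . '/*', GLOB_ONLYDIR) as $dir) {
    if (filemtime($dir) < time() - 4 * 3600) {
        array_map('unlink', glob($dir . '/*'));
        rmdir($dir);
    }
}

// 2. Create a randomly named directory and hard-link the product into it.
//    A hard link costs no extra disk space and survives even if the
//    original entry is renamed.
$token = bin2hex(random_bytes(16));
mkdir($downloadRoot . '/' . $token, 0755, true);
link($productDir . '/' . $filename,
     $downloadRoot . '/' . $token . '/' . $filename);

// 3. Hand the user a plain URL. The web server streams the file itself,
//    so no PHP time limit applies to the transfer.
echo "http://example.com/downloads/$token/" . rawurlencode($filename);
```

Note that link() requires the "downloads" and "products" directories to live on the same filesystem, and the web server must be allowed to serve the "downloads" tree directly (with directory listings disabled, so the random token is the only way in).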

FedeKrum