
I'm looking to limit the download speed of a file. I found the thread below, which works wonders for locally stored files, but my files are stored on an external server and I'm not entirely sure how to make this work with that server.

Reference: Limit download speed using PHP

Code:

<?php
set_time_limit(0);

// file that should be sent to the client (here it is a remote URL)
$local_file = 'https://remoteserver.com/example.mp4';
// filename that the user gets as default
$download_file = 'example.mp4';

// set the download rate limit in KB/s (1024 => 1 MB/s)
$download_rate = 1024;
if(file_exists($local_file) && is_file($local_file)) {
    // send headers
    header('Cache-control: private');
    header('Content-Type: application/octet-stream'); 
    header('Content-Length: '.filesize($local_file));
    header('Content-Disposition: attachment; filename="'.$download_file.'"');

    // flush content
    flush();    
    // open file stream
    $file = fopen($local_file, "rb");
    while(!feof($file)) {

        // send the current file part to the browser
        print fread($file, round($download_rate * 1024));    

        // flush the content to the browser
        flush();

        // sleep one second
        sleep(1);    
    }    

    // close file stream
    fclose($file);
} else {
    die('Error: The file '.$local_file.' does not exist!');
}




// note: $dl is never defined in this script (leftover download-limit check)
if ($dl) {
} else {
    header('HTTP/1.0 503 Service Unavailable');
    die('Abort, you reached your download limit for this file.');
}
?>
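
For comparison, here is a rough sketch of the streaming-proxy variant the script above is aiming at, i.e. pulling the file from the remote server and throttling it on the way through. This is only a sketch: it assumes allow_url_fopen is enabled, and since file_exists()/filesize() do not work on https:// URLs, the size is fetched with get_headers() instead. As the comments further down point out, this approach consumes local bandwidth for every download.

<?php
// Sketch of a streaming proxy: pull the remote file and throttle the output.
// Assumes allow_url_fopen is enabled; names below are illustrative only.
set_time_limit(0);

$remote_file   = 'https://remoteserver.com/example.mp4';
$download_name = 'example.mp4';   // filename shown to the user
$download_rate = 1024;            // KB per second

// file_exists()/filesize() cannot be used on https:// URLs,
// so fetch the response headers from the remote server instead.
$headers = get_headers($remote_file, 1);
if ($headers === false || strpos($headers[0], '200') === false) {
    die('Error: the remote file could not be reached.');
}

header('Cache-Control: private');
header('Content-Type: application/octet-stream');
if (isset($headers['Content-Length']) && !is_array($headers['Content-Length'])) {
    header('Content-Length: ' . $headers['Content-Length']);
}
header('Content-Disposition: attachment; filename="' . $download_name . '"');
flush();

$file = fopen($remote_file, 'rb');
while (!feof($file)) {
    print fread($file, $download_rate * 1024);  // one chunk per second
    flush();
    sleep(1);
}
fclose($file);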

Both servers share the same base domain, just different subdomains. I could limit the download speed with httpd configuration on the remote server; however, I want different speeds for different user permissions, and a single server-wide limit would just throttle everyone equally.

My Solution:

I used httpd / apache2 config (mod_ratelimit, whose rate values are in KiB/s) to limit download speeds for specific URL prefixes, for example https://remoteserver.com/slow/... and https://remoteserver.com/fast/..., using the config below.

<Location "/slow">
    SetOutputFilter RATE_LIMIT
    SetEnv rate-limit  5120
    SetEnv rate-initial-burst 5120
</Location>

<Location "/fast">
    SetOutputFilter RATE_LIMIT
    SetEnv rate-limit  204800
    SetEnv rate-initial-burst 204800
</Location>

I then used an .htaccess rewrite so URLs can include that prefix while still mapping to my existing file structure, which couldn't be split into subfolders.

RewriteEngine On
RewriteRule ^/?(?:slow|fast)(/downloads/.+)$ $1 [L,NC]
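
On the PHP side, picking the right prefix is then just a redirect based on the user's group. The sketch below is only illustrative: the 'user_group' session key and the downloads path are assumptions, since only my existing code knows how groups are actually stored.

<?php
// Sketch: choose the rate-limited or full-speed prefix per user group
// and redirect to the remote server. The 'user_group' session key and
// the downloads path are placeholders, not part of the original setup.
session_start();

$isPremium = (($_SESSION['user_group'] ?? 'guest') === 'premium');
$prefix    = $isPremium ? '/fast' : '/slow';
$file      = 'downloads/example.mp4';

header('Location: https://remoteserver.com' . $prefix . '/' . $file);
exit;
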
  • Set up a download script that gets the file from the other server and streams it through your script. – Jared Farrish Jan 12 '23 at 12:22
  • I had thought of this, but local storage is scarce whereas the external server has a large capacity. This wouldn't work with a large number of users, nor would it reduce outgoing speed if the file has to be downloaded at full speed first to get it onto the local server. – Paulamonopoly Jan 12 '23 at 12:26
  • Put the download script on the other server and accept an argument that toggles the speed that you can verify. One way would be encrypting a value with a private key that you can decrypt and validate (a JWT for instance). – Jared Farrish Jan 12 '23 at 12:35
  • Are you sharing session cookies between both domains? And how many download speeds do you need? What about having multiple subdomains with different download speeds? I think that using PHP is really not a good idea: you'll have a lot of load and memory use, and you'll tie up PHP processes for the duration of each download. Redirecting to the correct subdomain may be possible with a bit of logic in the HTTPD/NGINX config. But as @JaredFarrish says, you might have to use encrypted params such as a speed limit, a timestamp and a salt to stop users from copy-pasting the param (a rough sketch of a signed-parameter variant appears after these comments). – Patrick Janser Jan 12 '23 at 12:44
  • Currently my script uses the session ID as a download token to hopefully lock downloads to that specific user, and the remote server also has a .htaccess which only allows access / downloads from the referring URL. I was hoping the script above could easily be adapted for what I need, as the entire script is very basic. I was only looking for two different speeds for two different user types: full speed for premium and limited for non-premium / guests. My PHP can already determine which user belongs to which group. – Paulamonopoly Jan 12 '23 at 12:48
  • Then the script can generate a key to include in a URL that can be independently validated. – Jared Farrish Jan 12 '23 at 13:04
  • Also, as @PatrickJanser points out, if you're slowing some requests, i.e. they take longer, you're more likely to have more concurrent connections, which can cause its own issues. – Jared Farrish Jan 12 '23 at 13:08
  • I'm thinking the solution is to have subdomain1.example.com limited and subdomain2.example.com at full speed, with both virtual hosts on the remote server using the same file paths but different speed params in their httpd conf. – Paulamonopoly Jan 12 '23 at 13:30
  • I have edited my question to include my solution; if someone would like to write it up as an answer so I can accept it, be my guest! Thanks for the help Jared Farrish and Patrick Janser – Paulamonopoly Jan 12 '23 at 15:30
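
A minimal sketch of the validated-key idea from the comments above, using an HMAC signature rather than encryption. The download.php endpoint, shared secret, parameter names and 5-minute expiry are all assumptions, not part of the original setup.

<?php
// Sketch: signed download links, following the comment suggestion of a
// download script on the remote server that verifies a speed argument.
const SHARED_SECRET = 'change-me';

// Main server: build a link that encodes the file, speed tier and expiry.
function signedDownloadUrl(string $file, string $speed, int $ttl = 300): string
{
    $expires = time() + $ttl;
    $payload = $file . '|' . $speed . '|' . $expires;
    $sig     = hash_hmac('sha256', $payload, SHARED_SECRET);
    return 'https://remoteserver.com/download.php?' . http_build_query([
        'file'    => $file,
        'speed'   => $speed,
        'expires' => $expires,
        'sig'     => $sig,
    ]);
}

// Remote server (inside download.php): verify before serving the file.
function verifySignedRequest(array $query): bool
{
    if (!isset($query['file'], $query['speed'], $query['expires'], $query['sig'])) {
        return false;
    }
    if ((int) $query['expires'] < time()) {
        return false;   // link has expired
    }
    $payload  = $query['file'] . '|' . $query['speed'] . '|' . (int) $query['expires'];
    $expected = hash_hmac('sha256', $payload, SHARED_SECRET);
    return hash_equals($expected, $query['sig']);
}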
