Our school caps internet speed at about 7 KB/s, but we have some PHP hosts with 300 MB of space, and I realized there are no speed limits on our PHP scripts!
I wanted to download a file to my host, and I found an answer on this site.
Here is the problem: I can't download more than 300 MB, and I also want to be able to use my friends' accounts to speed up my downloads (with their permission, of course).
I have full access to a computer (at school) that can copy my less-than-300 MB files to itself so the host is free to download the rest.
To start, I wrote this code:
<?php
function downloadfile($url, $path) {
    $step = 1024;
    $fsize = 0;
    if (file_exists($path))
        $fsize = filesize($path);
    $dlsize = filesize($url);
    echo "$fsize<br>$dlsize<br>";
    $file = fopen($url, "rb");
    if ($file) {
        $newf = fopen($path, "ab");
        fseek($newf, $fsize);
        fseek($file, $fsize);
        if ($newf)
            while (ftell($newf) < $dlsize) {
                if ($dlsize - ftell($newf) < $step)
                    fwrite($newf, fread($file, $dlsize - ftell($newf)), $dlsize - ftell($newf));
                else
                    fwrite($newf, fread($file, $step), $step);
            }
    }
    if ($file)
        fclose($file);
    if ($newf)
        fclose($newf);
}

$url = $_GET['url'];
$exp = explode("/", $url);
downloadfile($url, $exp[count($exp) - 1]);
?>
It works great on local files, but fseek() and some other functions (like filesize()) don't work properly on URL streams.
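For example, I think the remote size has to come from the HTTP response headers rather than from filesize() — something like this sketch (it assumes the server actually sends a Content-Length header):

```php
<?php
// Sketch: read the remote file size from the Content-Length header,
// since filesize() only works on local/seekable streams.
// Returns -1 if the request fails or the header is missing.
function remote_filesize($url) {
    $headers = get_headers($url, 1);   // 1 = return an associative array
    if ($headers === false)
        return -1;
    $len = isset($headers['Content-Length']) ? $headers['Content-Length'] : -1;
    // After a redirect, Content-Length can be an array; take the last value.
    if (is_array($len))
        $len = end($len);
    return (int)$len;
}
?>
```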
I want it to behave like IDM: able to pause and resume a download. Also, if possible, I want it to download using multiple requests at once.
What's wrong with my function, and how do I download only a part of the file?
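For the multiple-requests idea, I believe PHP's curl_multi functions can run several ranged requests in parallel — a sketch of what I have in mind (the URL and byte ranges are just examples, and it assumes the server supports Range requests):

```php
<?php
// Sketch: fetch two halves of a file in parallel with curl_multi,
// then stitch them together in order. Example URL/ranges only.
function ranged_handle($url, $range) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_RANGE, $range);   // e.g. "0-499"
    return $ch;
}

$url = 'http://example.com/file.bin';       // hypothetical URL
$h1 = ranged_handle($url, '0-499');
$h2 = ranged_handle($url, '500-999');

$mh = curl_multi_init();
curl_multi_add_handle($mh, $h1);
curl_multi_add_handle($mh, $h2);

// Run both transfers until they finish.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

// Concatenate the two parts in the right order.
file_put_contents('file.bin', curl_multi_getcontent($h1) . curl_multi_getcontent($h2));

curl_multi_remove_handle($mh, $h1);
curl_multi_remove_handle($mh, $h2);
curl_multi_close($mh);
?>
```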
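From what I've read, partial downloads are normally done with the HTTP Range request header, which fopen() can send through a stream context — something like this sketch (it assumes the server honors Range requests and replies with 206 Partial Content):

```php
<?php
// Sketch: download bytes $from..$to of $url and append them to $path,
// using a Range header instead of fseek() on the URL stream.
// Assumes the server supports Range requests.
function download_range($url, $path, $from, $to) {
    $ctx = stream_context_create(array(
        'http' => array(
            'method' => 'GET',
            'header' => "Range: bytes=$from-$to\r\n",
        ),
    ));
    $in = fopen($url, 'rb', false, $ctx);
    if (!$in)
        return false;
    $out = fopen($path, 'ab');      // append mode: resuming just means
    while (!feof($in))              // asking for the next byte range
        fwrite($out, fread($in, 8192));
    fclose($in);
    fclose($out);
    return true;
}

// Resume example: if part of the file is already saved, fetch the rest
// ($total would be the remote size taken from the Content-Length header).
// download_range($url, $path, filesize($path), $total - 1);
?>
```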
More information:
The server is Linux-based.
The computer I have access to runs Windows 7.
I know VB.NET, PHP (!), and a bit of C++.