
There is an internet speed limit at our school of about 7 KB/s, and we have some PHP hosts with 300 MB of space. But I realized that there are no limits on our PHP scripts!

I wanted to download a file to my host, and I found an answer on this site.

Well, here is the problem: I can't download more than 300 MB, and I also want to be able to use my friends' accounts to speed up my downloads (with their permission, of course).

I have full access to a computer (at school) that can move my less_than_300MB_files to itself and let me download the rest.

To start, I wrote this code:

<?php
// Download $url to $path, resuming from the current size of $path
// if the file already exists.
function downloadfile($url, $path)
{
    $step = 1024;                  // chunk size in bytes
    $fsize = 0;
    if (file_exists($path))
        $fsize = filesize($path);  // how much we already have
    $dlsize = filesize($url);      // total size of the remote file
    echo "$fsize<br>$dlsize<br>";

    $file = fopen($url, "rb");
    if ($file)
    {
        $newf = fopen($path, "ab");
        fseek($newf, $fsize);      // skip the part we already have
        fseek($file, $fsize);
        if ($newf)
            while (ftell($newf) < $dlsize)
            {
                // copy one chunk (or whatever is left) to the local file
                if ($dlsize - ftell($newf) < $step)
                    fwrite($newf, fread($file, $dlsize - ftell($newf)), $dlsize - ftell($newf));
                else
                    fwrite($newf, fread($file, $step), $step);
            }
    }

    if ($file)
        fclose($file);

    if ($newf)
        fclose($newf);
}

$url = $_GET['url'];
$exp = explode("/", $url);

// save under the last path component of the URL
downloadfile($url, $exp[count($exp) - 1]);
?>

It works great on local files, but fseek and some other functions don't work properly on URLs.
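
From what I've read, the usual way to fetch only part of a remote file over HTTP is a Range request rather than fseek on the URL stream. Here is a minimal sketch of that idea, assuming the server honors Range headers (the URL, byte range, and part name below are placeholders):

<?php
// Minimal sketch: request only the first 1 MB of a remote file by sending
// an HTTP Range header, instead of seeking on the URL stream.
// Assumes the server honors Range requests (replies 206 Partial Content).
$url = "http://example.com/file.zip"; // placeholder URL
$context = stream_context_create(array(
    'http' => array(
        'method' => 'GET',
        'header' => "Range: bytes=0-1048575\r\n",
    ),
));
$file = fopen($url, "rb", false, $context);
if ($file) {
    $newf = fopen("file.zip.part0", "wb"); // placeholder part name
    while (!feof($file))
        fwrite($newf, fread($file, 8192));
    fclose($newf);
    fclose($file);
}
?>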

I want it to be like IDM: able to pause and resume a download. Also, if possible, I want it to download using multiple requests.

What's wrong with my function, and how do I download a part of the file?

More information: the server is Linux-based, the computer I have access to runs Windows 7, and I know VB.NET, PHP(!), and a bit of C++.

  • The server you are downloading the file from has to support partial downloads; not all servers do. Depending on how your bandwidth is managed (i.e. if there is a proxy in between), that may also break partial downloads. Try testing against a server you know supports partial downloads, if you can. – drone.ah Feb 10 '13 at 16:04
  • How can I check whether a server supports partial file downloads? (By the way, the server can download files, but there is no guarantee they aren't damaged! I downloaded 210 MB of OpenCV with it, and the remaining 40 MB with IDM at home!) – aliqandil Feb 10 '13 at 16:13
  • That is a good question. The only way I can think of is to use a browser that supports resuming downloads: start downloading a file, stop it, then resume. If it resumes correctly, the server supports partial downloads. I am pretty sure that Chrome/Chromium supports this. – drone.ah Feb 10 '13 at 16:20
  • Oh, got it! Yes, I actually check my files before giving them to the script. My way of checking whether a file supports partial downloads is to see whether IDM can download it over more than one connection; I just didn't know it was called a partial download. Thanks! – aliqandil Feb 10 '13 at 16:25
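
A programmatic check is also possible: look for an "Accept-Ranges: bytes" response header. Here is a minimal sketch using get_headers() (the URL is a placeholder); note that some servers honor Range requests without advertising this header, so a 206 Partial Content reply to a real Range request is the definitive test:

<?php
// Minimal sketch: fetch the response headers and look for
// "Accept-Ranges: bytes", which advertises partial download support.
$url = "http://example.com/file.zip"; // placeholder URL
foreach (get_headers($url) as $header)
    if (stripos($header, "accept-ranges: bytes") !== false)
        echo "Server advertises partial download support.\n";
?>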

1 Answer


OK, I did some things by myself. It's not exactly what I wanted, but it will do the trick somehow:

<?php
set_time_limit(0);

// true if $key is present in the array $ar
function existsin($ar, $key)
{
    foreach ($ar as $s => $val)
        if ($s == $key) return true;
    return false;
}

// write a status string to inf.txt inside the given download directory
function toinf($str, $path)
{
    $fhr = fopen("$path/inf.txt", "w");
    fwrite($fhr, $str);
    fclose($fhr);
}

// Download $url into $path, splitting the output into parts of $len MB each.
// $num is the numbered download directory; if $waittl is true, wait for a
// finished part to be moved away before starting the next one.
function downloadfile($url, $path, $len, $num, $waittl)
{
    $exp = explode("/", $url);
    $fname = $exp[count($exp) - 1];
    $prt = 0;
    $ppath = $path;
    $sp = 1;             // current chunk size, in KB
    $exectime = 10000;   // time the previous chunk took (seeded high)
    $file = fopen($url, "rb");

    if ($file)
    {
        $newf = fopen($path, "wb");
        if ($newf)
            while (!feof($file))
            {
                // the current part has reached $len MB: close it, start the next
                if (ftell($newf) / 1048576 >= $len && $len != 0)
                {
                    while (file_exists($ppath) && $waittl)
                        usleep(5000);   // wait for the finished part to be picked up

                    $prt = $prt + 1;
                    fclose($newf);
                    $ppath = $path . ".part" . $prt;
                    if (is_dir($num))
                        $newf = fopen($ppath, "wb");
                    else
                        die("rested!"); // download directory is gone, stop
                }
                toinf("$fname|Downloading", $num);

                // copy one chunk and time it; grow the chunk size while
                // transfers keep getting faster, otherwise shrink it
                $timepre = microtime(true);
                fwrite($newf, fread($file, 1024 * $sp), 1024 * $sp);
                $timepost = microtime(true);
                $exectime2 = $exectime;
                $exectime = $timepost - $timepre;
                if ($exectime2 > $exectime && 65536 > $sp)
                    $sp = $sp * 1.5;
                else if ($sp >= 1)
                    $sp = $sp / 1.5;
                if ($sp <= 0)
                    $sp = 1;
            }
    }

    if ($file)
        fclose($file);

    if ($newf)
        fclose($newf);
}

if (existsin($_GET, 'url') == false) {
    echo "invalid arg";
    return 0;
} else {
    echo "started.";
}
$url = $_GET['url'];
$exp = explode("/", $url);

// find the first unused numbered directory and create it for this download
for ($i = 1; is_dir($i); $i++);
mkdir($i);
toinf($exp[count($exp) - 1] . "|Started", $i);
$maxlen = 50;
$waittl = true;
if (existsin($_GET, 'maxlen') == true) $maxlen = $_GET['maxlen'];
if (existsin($_GET, 'wait') == true) $waittl = false; // ?wait disables waiting
downloadfile($url, $i . "/" . $exp[count($exp) - 1], $maxlen, $i, $waittl);
toinf($exp[count($exp) - 1] . "|Done", $i);
?>

I'm not able to resume a download yet, but I can download it part by part and get past the school space filter... But sometimes (I don't know why) PHP suddenly terminates and leaves me with nothing but a bunch of useless parts!
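
One possible explanation, if the script dies when the browser is closed or the connection drops, is that PHP aborts on client disconnect; set_time_limit(0) does not cover that case. A minimal sketch of the usual countermeasure, assuming a disconnect really is the cause here:

<?php
// Assumption: the script is killed by a client disconnect, not by the
// execution time limit. ignore_user_abort() keeps it running either way.
ignore_user_abort(true); // keep running even if the browser disconnects
set_time_limit(0);       // no execution time limit
// ... download loop goes here ...
?>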
