
Information

There are many ways to download files in PHP: file_get_contents + file_put_contents, fopen, readfile, and cURL.

Question

  • When downloading a large file, say 500 MB, from another server / domain, what is the "correct" way to download it safely? If the connection fails, it should find the last position and continue, OR download the file again if it contains errors.
  • It's going to be used on a website, not in the php.exe shell.

What I figured out so far

  • I've read about AJAX solutions with progress bars, but what I'm really looking for is a PHP solution.
  • I don't need to buffer the file into a string the way file_get_contents does; that holds the whole file in memory.
  • I've also read about memory problems. A solution that doesn't use much memory would be preferred.

Concept

This is roughly what I want if the result is false:

function download_url( $url, $filename ) {
    // Code
    $success['success'] = false;
    $success['message'] = 'File not found';
    return $success;
}
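For reference, here is a rough sketch of how download_url could be implemented with cURL (hypothetical code; it assumes the cURL extension is enabled). CURLOPT_FILE makes cURL write the response straight to a file handle, so the file is never buffered into a PHP string:

```php
// Hypothetical sketch of the download_url() concept above.
// Streams the response body directly to disk via CURLOPT_FILE,
// so memory use stays constant regardless of file size.
function download_url($url, $filename) {
    $fp = fopen($filename, 'w');
    if ($fp === false) {
        return array('success' => false, 'message' => 'Cannot open local file');
    }

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);           // write body to $fp, not to memory
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_FAILONERROR, true);   // treat HTTP >= 400 as failure
    $ok = curl_exec($ch);
    $error = curl_error($ch);
    curl_close($ch);
    fclose($fp);

    if ($ok === false) {
        return array('success' => false, 'message' => $error !== '' ? $error : 'File not found');
    }
    return array('success' => true, 'message' => 'OK');
}
```

This sketch does not handle resuming by itself; cURL's CURLOPT_RANGE or CURLOPT_RESUME_FROM would be needed for that, as the answers below show.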
Jens Törnell
  • I've personally had good results using cURL, are you downloading from a remote place, or from the same server as your script? – naththedeveloper Jun 06 '13 at 18:04
  • 1
    I always use cURL as well. Im aware of the others but it always seemed silly to use them given the feature set of the cURL extension. Aslo does this download operation in the context of processing an http request, or is this something invoked from the command lined or otherwise spawned in a process? – prodigitalson Jun 06 '13 at 18:05
  • Also, there are a good number of other [SO](http://stackoverflow.com/questions/3176942/using-php-to-download-files-not-working-on-large-files) [answers](http://stackoverflow.com/questions/15129523/downloading-large-files-using-php) [for](http://stackoverflow.com/questions/6527811/how-to-download-large-files-through-php-script) [this](http://stackoverflow.com/questions/597159/sending-large-files-reliably-in-php) – naththedeveloper Jun 06 '13 at 18:06
  • @FDL From another server / domain. – Jens Törnell Jun 06 '13 at 18:08
  • @prodigitalson It's going to be used on a web site (backend). I updated my question with this info now. – Jens Törnell Jun 06 '13 at 18:11
  • possible duplicate of [How download big file using PHP (low memory usage)](http://stackoverflow.com/questions/4000483/how-download-big-file-using-php-low-memory-usage) – PeeHaa Jun 06 '13 at 21:03

2 Answers


The easiest way to copy large files is demonstrated in Save large files from php stdin, but that answer does not show how to copy files with an HTTP Range header.

$url = "http://REMOTE_FILE";
$local = __DIR__ . "/test.dat";

try {
    $download = new Downloader($url);
    $download->start($local); // Start Download Process
} catch (Exception $e) {
    printf("Copied %d bytes\n", $pos = $download->getPos());
}

When an Exception occurs, you can resume the download from the last position:

$download->setPos($pos);
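Putting the two snippets together, a hypothetical retry loop (the attempt count and pause are arbitrary choices) could keep resuming until the download finishes:

```php
// Hypothetical retry loop around the Downloader class: each failed
// attempt leaves $download->getPos() at the last written byte, so
// the next start() picks up from there instead of starting over.
$download = new Downloader($url);
$done = false;
for ($attempt = 1; $attempt <= 5 && !$done; $attempt++) {
    try {
        $download->start($local); // resumes at $download->getPos()
        $done = true;
    } catch (Exception $e) {
        printf("Attempt %d failed at byte %d: %s\n",
               $attempt, $download->getPos(), $e->getMessage());
        sleep(1); // brief pause before resuming
    }
}
```
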

Class used

class Downloader {
    private $url;
    private $length = 8192;
    private $pos = 0;
    private $timeout = 60;

    public function __construct($url) {
        $this->url = $url;
    }

    public function setLength($length) {
        $this->length = $length;
    }

    public function setTimeout($timeout) {
        $this->timeout = $timeout;
    }

    public function setPos($pos) {
        $this->pos = $pos;
    }

    public function getPos() {
        return $this->pos;
    }

    public function start($local) {
        $part = $this->getPart("0-1");

        // Check partial Support
        if ($part && strlen($part) === 2) {
            // Split data with curl
            $this->runPartial($local);
        } else {
            // Use stream copy
            $this->runNormal($local);
        }
    }

    private function runNormal($local) {
        $in = fopen($this->url, "r");
        // Append when resuming so already-downloaded data is kept
        $out = fopen($local, $this->pos > 0 ? "a" : "w");

        // The HTTP stream is not seekable, so skip the bytes that
        // were already downloaded by reading and discarding them
        $skipped = 0;
        while ($skipped < $this->pos) {
            $data = fread($in, min($this->length, $this->pos - $skipped));
            if ($data === false || $data === "") {
                break;
            }
            $skipped += strlen($data);
        }

        $this->pos += stream_copy_to_stream($in, $out);
        fclose($in);
        fclose($out);
        return $this->pos;
    }

    private function runPartial($local) {
        $i = $this->pos;
        $fp = fopen($local, 'c'); // 'c' does not truncate, so resuming keeps existing data
        fseek($fp, $this->pos);
        while (true) {
            $data = $this->getPart(sprintf("%d-%d", $i, $i + $this->length));

            // The server returned an unexpected status code
            if ($data === -1) {
                fclose($fp);
                throw new Exception("File Corrupted");
            }

            // Range not satisfiable: the end of the file was reached
            if (! $data) {
                break;
            }

            fwrite($fp, $data);
            $i += strlen($data);
            $this->pos = $i;
        }

        fclose($fp);
    }

    private function getPart($range) {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $this->url);
        curl_setopt($ch, CURLOPT_RANGE, $range);
        curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_TIMEOUT, $this->timeout);
        $result = curl_exec($ch);
        $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        curl_close($ch);

        // 416: Requested Range Not Satisfiable
        if ($code == 416)
            return false;

        // Anything other than 206 Partial Content is an error
        if ($code != 206)
            return -1;

        return $result;
    }
}
Baba
  • I can't download the following file http://www.biart7.com/dh_demo.zip with your Downloader. I get a "File Corrupted" exception. – alexanoid Nov 08 '13 at 13:52
  • Most likely the server does not support range requests. – Baba Nov 08 '13 at 14:11
  • 1
    this approach works.. but its very slow. what does $length do ? and is there anyway to super speed up this ? i am using a shared server. it has about 1gb ram – shakee93 Nov 28 '15 at 17:14

You'd want to download the remote file in chunks. This answer has a great example:

How download big file using PHP (low memory usage)
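The idea behind the linked answer is to read and write in fixed-size chunks instead of loading the whole file at once. A minimal sketch (the 8192-byte chunk size and the function name are arbitrary):

```php
// Minimal sketch of a chunked copy: at most $chunk bytes are held
// in memory at a time, no matter how large the source file is.
function copy_in_chunks($from, $to, $chunk = 8192) {
    $in = fopen($from, 'rb');
    $out = fopen($to, 'wb');
    if ($in === false || $out === false) {
        return false;
    }
    $bytes = 0;
    while (!feof($in)) {
        $data = fread($in, $chunk);   // read one chunk
        if ($data === false) {
            break;
        }
        $bytes += fwrite($out, $data); // write it out immediately
    }
    fclose($in);
    fclose($out);
    return $bytes;                     // total bytes copied
}
```

The same function works for a remote $from URL as long as allow_url_fopen is enabled, since fopen/fread operate on HTTP streams too.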

Rob W