
I need a PHP script for resumable file download from a URL to my server. It should start the download, and when the connection drops (after 30 sec to 5 min) resume it, and so on until the whole file completes.

There is something similar in Perl at http://curl.haxx.se/programs/download.txt , but I want to do it in PHP; I don't know Perl.

I am thinking of using CURLOPT_RANGE to download chunks, and fopen($fileName, "a") to append them to the file on the server.

Here is my try:

<?php

session_start(); // $_SESSION['url'] and $_SESSION['filename'] are set beforehand

function run()
{
    while (1)
    {
        get_chunk($_SESSION['url'], $_SESSION['filename']);
        // TODO: stop looping once the local file size reaches the remote size
        sleep(5);
        flush();
    }
}

function get_chunk($url, $fileName)
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);

    // Resume from wherever the partial file on disk ends
    if (file_exists($fileName)) {
        $from = filesize($fileName);
        curl_setopt($ch, CURLOPT_RANGE, $from . "-"); // or $from . "-" . ($from + 1048576) for 1 MB chunks
    }

    $fp = fopen($fileName, "a");
    if (!$fp) {
        exit;
    }
    curl_setopt($ch, CURLOPT_FILE, $fp); // write the response body straight to the file
    $result = curl_exec($ch);
    curl_close($ch);

    fclose($fp);
}

?>
This will be helpful: http://stackoverflow.com/questions/2032924/how-to-partially-download-a-remote-file-with-curl – kpotehin Oct 21 '12 at 00:53

2 Answers


If your intent is to download a file over a flaky connection, the curl command-line tool has a --retry flag to automatically retry the download on error and continue where it left off. Unfortunately, the PHP library seems to be missing that option because libcurl itself is missing it.

Normally I recommend using a library rather than an external command, but rather than rolling your own it may be simpler in this case to just invoke curl --retry or curl -C - on the command line. wget -c is another option.
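For example, shelling out from PHP could look something like this (a rough sketch, assuming the curl binary is on the PATH; $url and $localFile are placeholder values):

<?php
// Shell out to the curl binary and let it handle retry/resume itself.
// --retry 10 : retry up to 10 times on transient errors
// -C -       : continue from wherever the partial file left off
// -o         : write to the given local path
$url       = 'http://example.com/big.iso'; // placeholder
$localFile = '/tmp/big.iso';               // placeholder

$cmd = sprintf(
    'curl --retry 10 -C - -o %s %s',
    escapeshellarg($localFile),
    escapeshellarg($url)
);
exec($cmd, $output, $exitCode);

if ($exitCode !== 0) {
    // curl gave up after all retries; inspect $exitCode
}
?>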

Otherwise I don't see the need to always get the data in chunks. Download as much as you can, and if there's an error, resume using CURLOPT_RANGE and the file size, as you are doing now.
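In PHP that could look something like this (a rough sketch, assuming the server honours Range requests; the function name and retry limits are made up):

<?php
// Download the whole file in one go; on failure, resume from the
// size of the partial file and try again, up to a retry limit.
function download_with_resume($url, $fileName, $maxAttempts = 10)
{
    for ($attempt = 0; $attempt < $maxAttempts; $attempt++) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);

        // Resume from the end of what we already have on disk
        if (file_exists($fileName)) {
            curl_setopt($ch, CURLOPT_RANGE, filesize($fileName) . '-');
        }

        $fp = fopen($fileName, 'a');
        if (!$fp) {
            return false;
        }
        curl_setopt($ch, CURLOPT_FILE, $fp);

        curl_exec($ch);
        $err = curl_errno($ch);
        curl_close($ch);
        fclose($fp);

        if ($err === 0) {
            return true; // transfer finished cleanly
        }
        sleep(5); // transient failure: wait, then resume
    }
    return false; // gave up after $maxAttempts tries
}
?>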

Schwern

This is my solution for chunked file download in PHP, using fopen rather than curl:

//set the chunk size, i.e. how much you would like to transfer in one go
$chunksize = 5 * (1024 * 1024);
//open your local file with a+ access (appending to the file = writing at end of file)
$fp = fopen($local_file_name, 'a+');
if ($fp === false)
{
    //error handling, local file cannot be opened
}
else
{
    //open the remote file with read permission; allow_url_fopen must be enabled on your server if you open a URL here
    $handle = fopen($temp_download, 'rb');
    if ($handle === false)
    {
        //error handling, remote file cannot be opened
    }
    else
    {
        //while we have not reached the end of the remote file, loop
        while (!feof($handle))
        {
            //read a chunk of the file
            $chunk_info = fread($handle, $chunksize);
            if ($chunk_info === false)
            {
                //error handling, chunk reading failed
            }
            else
            {
                //write the chunk we just read to the local file
                $succ = fwrite($fp, $chunk_info);
                if ($succ === false)
                {
                    //error handling, writing the chunk locally failed
                }
            }
        }
        //close the remote handle
        fclose($handle);
    }
    //close the local handle
    fclose($fp);
}
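Usage is just a matter of setting the two variables the snippet expects (placeholder values shown). Note that a plain fopen of a URL always starts reading at byte 0, so this copies the file in chunks but does not by itself resume an interrupted transfer:

<?php
//placeholder values for the snippet above
$temp_download   = 'http://example.com/big.iso'; //remote URL to read from
$local_file_name = '/tmp/big.iso';               //local path to append to
//...the chunked copy loop above goes here...
?>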