
My provider doesn't allow fopen(), so I used curl instead to download files. But this only works for smaller files (< 100 MB) because of server timeouts and memory limits, and I need to download large files (1 GB).

Is there another way to download large files?

Here is my code:

$file = $_GET['dateiUrl'];

// set headers for the file download
header("Cache-Control: public");
header("Content-Description: File Transfer");
header('Content-Type: application/force-download');
header("Content-Transfer-Encoding: binary");
header("Content-Disposition: attachment; filename=" . basename($file));

// curl part: fetch the remote file
set_time_limit(0);
$ch = curl_init();
curl_setopt($ch, CURLOPT_HEADER, 0);          // don't include response headers in the output
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);  // buffer the whole response in memory
curl_setopt($ch, CURLOPT_URL, $file);
$data = curl_exec($ch);
curl_close($ch);

// output the buffered file (the entire file is held in memory here,
// which is what hits the memory limit for large files)
echo $data;
Philipp Kühn

2 Answers


You may be able to slice the large file into smaller chunks.

Check out the "Content-Range:" header and HTTP response 206 ("Partial Content").

Of course, if you slice the file into chunks, you need something to assemble them back together again. But there are lots of ways to do that, and none of them should take much in the way of server resources. It will be interesting to see if you can manage to do it without using fopen(). :-)

See also: Difference between Content-Range and Range headers?
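
As a rough sketch of that idea, assuming the remote server honours Range requests (i.e. answers with 206 Partial Content), the file could be pulled in fixed-size chunks and streamed straight to the client without ever holding the whole thing in memory. The names $chunkSize and $start and the 10 MB chunk size below are purely illustrative, not part of your original code:

$file      = $_GET['dateiUrl'];
$chunkSize = 10 * 1024 * 1024; // 10 MB per request, purely illustrative

header("Content-Description: File Transfer");
header("Content-Type: application/octet-stream");
header("Content-Disposition: attachment; filename=" . basename($file));

set_time_limit(0);

$start = 0;
do {
    $end = $start + $chunkSize - 1;

    $ch = curl_init($file);
    curl_setopt($ch, CURLOPT_RANGE, $start . '-' . $end);  // request only this byte range
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0);            // don't buffer the chunk
    curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $data) {
        echo $data;            // pass the chunk straight through to the client
        flush();
        return strlen($data);  // tell curl how many bytes were handled
    });
    curl_exec($ch);

    $received = curl_getinfo($ch, CURLINFO_SIZE_DOWNLOAD);  // bytes delivered in this request
    $status   = curl_getinfo($ch, CURLINFO_HTTP_CODE);      // 206 as long as the range was honoured
    curl_close($ch);

    $start += $chunkSize;
} while ($status == 206 && $received >= $chunkSize);

Each individual request stays small, so neither the memory limit nor the execution time of a single transfer should be the bottleneck any more.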

Graham

You may find CURLOPT_RESUME_FROM or CURLOPT_RANGE useful for resuming a download from partially downloaded data.
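
A minimal sketch of what that might look like, assuming the remote server supports resuming; $alreadyDownloaded is a hypothetical offset (however many bytes made it through before the previous attempt was cut off):

$file = $_GET['dateiUrl'];
$alreadyDownloaded = 50 * 1024 * 1024;   // hypothetical: 50 MB were already delivered

$ch = curl_init($file);
// resume the transfer at the given byte offset
curl_setopt($ch, CURLOPT_RESUME_FROM, $alreadyDownloaded);
// alternatively, request an explicit byte range instead:
// curl_setopt($ch, CURLOPT_RANGE, $alreadyDownloaded . '-');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0);  // write straight to output, not into memory
curl_exec($ch);
curl_close($ch);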

nodakai