I know there are lots of posts similar to this, but after crawling SO I still haven't found an answer.
I am looking to write a script that acts as a proxy for downloading large remote images (around 10 MB each). So far I am using curl to fetch the remote image URL and then sending headers to force a download. Something like this (not the full script):
function getRemoteFile($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 50);
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
$file = getRemoteFile($url);

header('Content-Type: application/octet-stream'); // 'octet/stream' is not a valid MIME type
header('Content-Disposition: attachment; filename="random.jpg"');
header('Content-Length: ' . strlen($file));
echo $file;
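One concern with the snippet above: CURLOPT_RETURNTRANSFER buffers the whole image in PHP's memory before any of it reaches the client, so each in-flight request holds ~10 MB. A possible alternative is to stream each chunk straight through as curl receives it. This is only a sketch under that assumption; the function name streamRemoteFile is mine, not part of the script above, and Content-Length is omitted because the size isn't known until the transfer finishes:

```php
<?php
// Hypothetical streaming variant: CURLOPT_WRITEFUNCTION forwards each chunk
// to the client as it arrives, so the full image is never held in memory.
function streamRemoteFile($url, $filename) {
    // Headers must go out before the first chunk is echoed.
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $filename . '"');

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 50);
    curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) {
        echo $chunk;
        flush();               // push the chunk to the client immediately
        return strlen($chunk); // curl aborts the transfer on a short return
    });
    curl_exec($ch);
    curl_close($ch);
}
```

The trade-off is losing the Content-Length header (so the browser can't show a download percentage) in exchange for near-constant memory per request, which matters more at the concurrency levels described below.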
This works, but is there a better way? The script may see quite a lot of traffic: maybe 300 concurrent users making 10 requests each.
The images will be served from a server on the same network.