
I'm trying to archive a big file using PHP and send it to the browser for download. The problem is that the file is located on a remote machine and the only way to get it is via HTTP. So imagine this is my file: https://dropboxcontent.com/user333/3yjdsgf/video1.mp4

It's a direct link, and I can download the file with wget, curl, or anything similar. When a user wants to download it, I first fetch the file to the server, then zip it up, and then send it to the user. If the file is really large, the user has to sit there waiting for the server to finish downloading it before the download dialog box appears in their browser.

Is there a way to start downloading the file https://dropboxcontent.com/user333/3yjdsgf/video1.mp4 (say, into a local /tmp/video.mp4) and simultaneously add it to an archive that is streamed to the user's browser?

I'm using this library to zip it up: https://github.com/barracudanetworks/ArchiveStream-php, which works great, but the bottleneck is still fetching the file to the server's local filesystem.

Here is my code:

$f = file_get_contents("https://dropboxcontent.com/user333/3yjdsgf/video1.mp4");
$zip->add_file('big/hello.mp4', $f);

The problem is that the line $f = file_get_contents("https://dropboxcontent.com/user333/3yjdsgf/video1.mp4"); takes too long when the file is really big.

codemonkey

1 Answer


As suggested in the comments on the original post by Touch Cat Digital Inc, I found the answer here: https://stackoverflow.com/a/6914986/1927991

Reading the remote file in chunks and streaming each chunk into the archive as it arrives was the answer. Very clever.
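In essence, you open the remote URL as a stream and push it through ArchiveStream-php chunk by chunk, so bytes reach the browser while they are still arriving from the remote host. Below is a minimal sketch, assuming the library's streaming methods (init_file_stream_transfer(), stream_file_part(), complete_file_stream()) and its instance_by_useragent() factory as shown in its README, that allow_url_fopen is enabled, and that the remote server reports a Content-Length with no redirects; the chunk size and archive name are illustrative:

$url = 'https://dropboxcontent.com/user333/3yjdsgf/video1.mp4';

// Size of the remote file, which init_file_stream_transfer() expects.
// Note: get_headers() issues a GET by default; with redirects,
// 'Content-Length' may come back as an array.
$headers = get_headers($url, 1);
$size = (int) $headers['Content-Length'];

$zip = ArchiveStream::instance_by_useragent('videos');

// Open the remote file as a read stream instead of buffering the
// whole thing into memory with file_get_contents().
$fh = fopen($url, 'rb');

$zip->init_file_stream_transfer('big/hello.mp4', $size);
while (!feof($fh)) {
    // Read ~1 MB at a time and push it straight into the archive
    // stream, which flushes it on to the browser.
    $zip->stream_file_part(fread($fh, 1024 * 1024));
}
fclose($fh);

$zip->complete_file_stream();
$zip->finish();

With this approach nothing is written to /tmp at all: memory use stays bounded by the chunk size, and the browser's download dialog appears as soon as the first archive bytes are flushed rather than after the whole remote fetch completes.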

codemonkey