1

I have a large file (500 MB) at the URL http://domain.com/somefile.mxt. Because it's a large file, I'm not sure which PHP function is best suited for downloading it. And how do I save it after that?

P.S. The file extension may be anything, not just .mxt.

I'm also working with the Zend Framework, so if there's anything specific to Zend, that would be helpful.

animuson
sameold
  • possible duplicate of [PHP readfile() and large downloads](http://stackoverflow.com/questions/2946791/php-readfile-and-large-downloads) – Marc B Nov 04 '11 at 04:34
  • possible duplicate of [php curl download to file](http://stackoverflow.com/questions/6409462/php-curl-download-to-file) – deceze Nov 04 '11 at 04:36
  • @Marc B, this link is about setting up something on the server to push to the client in a certain way. I'm not doing this. I'm the client and trying to just download – sameold Nov 04 '11 at 04:37

2 Answers

2

To download it into the directory your script is running from (must have write permissions):

exec('wget http://domain.com/somefile.mxt');
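
Note that exec() only works if shell functions aren't disabled and the wget binary is installed on the server; if the URL comes from user input it should also be escaped. A slightly more defensive sketch (the $url and $target variables are just illustrative names):

$url    = 'http://domain.com/somefile.mxt';
$target = 'somefile.mxt'; // local filename to save as
exec('wget ' . escapeshellarg($url) . ' -O ' . escapeshellarg($target), $output, $exitCode);
if ($exitCode !== 0) {
    // wget returns a non-zero exit code on failure
    echo 'wget failed with exit code ' . $exitCode;
}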

To import it into a variable in the PHP script (no write permissions needed, but allow_url_fopen must be enabled in php.ini):

$content=file_get_contents('http://domain.com/somefile.mxt');

Using the above you can parse the file and then write the output to a local file with:

file_put_contents('somefile.mxt',$content);

For writing the file to disk you need write permissions in the directory you are putting it into.
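
As the comments below point out, file_get_contents reads the whole download into memory, which is risky for a 500 MB file. A minimal sketch of the alternative they mention, streaming straight to disk with cURL (the local filename is only an example):

$fp = fopen('somefile.mxt', 'w');                // open the local file for writing
$ch = curl_init('http://domain.com/somefile.mxt');
curl_setopt($ch, CURLOPT_FILE, $fp);             // write the response body to $fp instead of returning it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects if the server sends any
if (curl_exec($ch) === false) {
    echo 'Download failed: ' . curl_error($ch);
}
curl_close($ch);
fclose($fp);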

Alasdair
  • I wouldn't recommend `file_get_contents` for a 500MB+ file though. Better write straight to disk using CURL, streams or, if you have to, `wget`. – deceze Nov 04 '11 at 04:51
  • For downloading 1 file, file_get_contents is virtually the same as CURL. CURL is much faster only if you're downloading many files or you want to do something fancy like follow redirects or login. And wget is very fast because it's done outside of PHP and has minimal overhead, so even though it's not actual PHP code, it is a very efficient way of downloading a file. Also, if you want to download multiple files then you can do them all at the same time with wget, which is much faster than anything other than using multithreading in CURL (which is complicated to implement). – Alasdair Nov 04 '11 at 05:02
  • The difference is that you can make CURL write the file directly to disk (see the duplicate I linked to), while `file_get_contents` always reads the data into memory, which is not a good idea for large files. – deceze Nov 04 '11 at 05:30
  • Yes, which is why I suggested wget to download if the file is not needed inside the script. – Alasdair Nov 04 '11 at 06:47
  • Also, this is obviously a beginner's question and CURL is not so user-friendly as wget or file_get_contents. In general I only use CURL for really serious projects, but if I just want to download a file with PHP then wget or file_get_contents are so much easier to use. – Alasdair Nov 04 '11 at 07:05
0

I've not used it, but it seems Zend HTTP can handle this if you use setStream().

See Zend HTTP Data Streaming; here's the example code:

$client->setStream(); // will use temp file
$response = $client->request('GET');
// copy file
copy($response->getStreamName(), "my/downloads/file");
// use stream
$fp = fopen("my/downloads/file2", "w");
stream_copy_to_stream($response->getStream(), $fp);
// Also can write to known file
$client->setStream("my/downloads/myfile")->request('GET');
John Carter