0

I am trying to download a 7GB file using php-curl however it seems that it will only download the first 2GB.

There doesn't seem to be any documentation or talk about this.

Anyone have any ideas?

Jake Cattrall
  • 461
  • 4
  • 20
  • Different file systems have limits on file size, http://en.wikipedia.org/wiki/Comparison_of_file_systems , which one is yours? – ajreal Jul 29 '12 at 16:58
  • http://curl.haxx.se/mail/lib-2003-08/0145.html http://curl.haxx.se/mail/lib-2010-08/0054.html – ajreal Jul 30 '12 at 16:51

2 Answers

2

Here are two useful links on the topic:

Downloading a large file using curl

How to partially download a remote file with cURL?

Basically you may have two problems here:

  • You are reading into memory first and as such exhausting PHP's memory allocation
  • You may need to download the file in chunks to overcome certain restrictions in the HTTP protocol.
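The first point can be sketched as follows: hand cURL an open file handle via `CURLOPT_FILE` so the body is streamed to disk as it arrives, rather than accumulated in PHP's memory. The URL and destination path here are placeholders, not from the question:

```php
<?php
// Placeholder URL and target path -- adjust for your setup.
$url  = 'http://example.org/big.bin';
$dest = '/tmp/big.bin';

// Open the destination for writing and hand the handle to cURL,
// so the response body goes straight to disk instead of being
// buffered into a PHP string.
$fp = fopen($dest, 'wb');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);            // write body directly to $fp
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
curl_setopt($ch, CURLOPT_TIMEOUT, 0);           // no overall time limit
curl_exec($ch);
curl_close($ch);
fclose($fp);
```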

There are also file system limitations and so on, so check your file system type as mentioned by @ajreal (e.g. FAT32 has a limit of 4GB; there's a 99% chance you're not using FAT, but it is still an example).

As the OP found out, it was to do with the DB:

Turns out it was a database issue. File sizes were stored in a MySQL database, the sizes were in bytes, and the max value for an "int" column is 2147483647. Changing the column type to "bigint" fixed the issue.
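That fix can be sketched as the following DDL; the table and column names are assumptions for illustration, not taken from the question:

```sql
-- Sketch of the OP's fix, assuming a table `files` with a byte-count
-- column `size`. A signed INT tops out at 2147483647 (~2GB), so any
-- larger file size overflows; BIGINT raises the ceiling far beyond that.
ALTER TABLE files MODIFY size BIGINT NOT NULL;
```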

Sammaye
  • 43,242
  • 7
  • 104
  • 146
  • I am using content disposition in PHP; with your first link, how can I use this to echo out bytes instead of writing to a file? – Jake Cattrall Jul 29 '12 at 17:27
  • @JakeCattrall Instead of using CURLOPT_FILE you would just get the contents as normal, with the 2GB problem slowly (or quickly) filling the server's memory with this file, which, may I say, is inadvisable. – Sammaye Jul 29 '12 at 18:35
  • Does CURLOPT_WRITEFUNCTION use RAM? Even so, I have 34GB of RAM so I'm not really worried about that. I just need to bypass this limit somehow... – Jake Cattrall Jul 29 '12 at 19:22
  • Turns out it was a database issue. File sizes were stored in a MySQL database, the sizes were in bytes, and the max value for an "int" column is 2147483647. Changing the column type to "bigint" fixed the issue. – Jake Cattrall Jul 30 '12 at 22:13
  • @JakeCattrall Just noticed you replied earlier; aha, thanks for following it up :), I'll add it to the answer – Sammaye Jul 30 '12 at 22:56
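The comment thread above asks about echoing bytes to the client rather than writing to a file. A minimal sketch using `CURLOPT_WRITEFUNCTION`, which hands each chunk to a callback as it arrives so only one chunk is in memory at a time (the URL and filename are placeholders):

```php
<?php
// Sketch: relay the download to the browser chunk by chunk instead of
// saving it server-side. The callback must return the number of bytes
// it handled; returning anything less aborts the transfer.
$url = 'http://example.org/big.bin'; // placeholder

header('Content-Disposition: attachment; filename="big.bin"');

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) {
    echo $chunk;           // pass the chunk straight through to the client
    flush();               // push it out instead of buffering server-side
    return strlen($chunk); // tell cURL the whole chunk was consumed
});
curl_exec($ch);
curl_close($ch);
```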
0

Assuming your file system can handle files larger than 2GB, you can try using copy:

copy("http://example.org/your_file","/tmp/your_file");

Also make sure you set an appropriate time limit (with set_time_limit(...)).
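Putting both together, a minimal sketch (the URL and path are the placeholders from the answer): `copy()` streams the remote file to disk via PHP's stream layer, so the whole body is never held in memory, and `set_time_limit(0)` keeps a long transfer from being killed by the default execution limit.

```php
<?php
// Sketch: download a large remote file with copy(), which streams in
// chunks rather than loading the full body into memory.
set_time_limit(0); // a multi-GB download easily exceeds the default 30s

$src  = 'http://example.org/your_file';
$dest = '/tmp/your_file';

if (!copy($src, $dest)) {
    $err = error_get_last();
    echo "Copy failed: {$err['message']}\n";
}
```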

Vatev
  • 7,493
  • 1
  • 32
  • 39