15

I need to copy a big file (6 GB) via PHP. How can I do that? The copy() function can't do it.

I am using PHP 5.3 on Windows 32/64.

Bauer01
  • 3
    Why can't `copy` do that? Do you get an error message? – Gordon Jul 03 '11 at 21:48
  • Copy to where? Can the destination filesystem support 6GB files? – Lightness Races in Orbit Jul 03 '11 at 22:06
  • I thought that PHP has some limits (> 2 GB), see http://www.php.net/manual/en/function.copy.php#69001. copy() gets only a few MB from the file, no error. There is also a problem with filesize(). Yes, the destination filesystem is NTFS. – Bauer01 Jul 03 '11 at 23:24
  • @Gordon Because copy() seems to revert the internal offset position back to 0 after every 4 GiB, even on 64-bit builds of PHP on 64-bit systems. – StanE Jul 28 '21 at 22:17
  • 1
    @StanE that seems to be file system related. See https://bugs.php.net/bug.php?id=81145. – Gordon Jul 29 '21 at 08:07

6 Answers

23

This should do it.

function chunked_copy($from, $to) {
    # 1 meg at a time, you can adjust this.
    $buffer_size = 1048576;
    $ret = 0;
    $fin = fopen($from, "rb");
    $fout = fopen($to, "wb"); # the "b" (binary) flag matters on Windows, or the data gets mangled
    while(!feof($fin)) {
        $ret += fwrite($fout, fread($fin, $buffer_size));
    }
    fclose($fin);
    fclose($fout);
    return $ret; # return number of bytes written
}
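
For example (the paths here are just placeholders):

$bytes = chunked_copy('D:/video/big.mkv', 'E:/backup/big.mkv');
echo $bytes . " bytes written";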
Dogbert
2

Recent versions of PHP copy files in chunks, so today you can use PHP's copy() function safely.
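
For instance, a plain copy() call with a failure check (the paths are placeholders):

if (!copy('D:/big.iso', 'E:/big.iso')) {
    echo 'copy failed';
}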

  • 2
    What does "recent" mean here? This answer would be much better with a version number... – ftrotter Mar 25 '19 at 06:30
  • No, I don't think that this is correct. I am using PHP 8.0.7 compiled as 64-bit (so it uses 8-byte integers internally). copy() still doesn't support copying files larger than 4 GiB. It does copy them, but it starts to write from offset 0 after every 4 GiB. – StanE Jul 28 '21 at 22:16
  • PHP 8.1.9 gets stuck at ~2.7 GB when copying a 4 GB file with `copy()` – Flash Thunder Aug 26 '22 at 09:35
2

If copy() doesn't work, you can try stream_copy_to_stream().

Example

stream_copy_to_stream(
    fopen('/path/to/input/file.txt', 'rb'),  # binary mode matters on Windows
    fopen('/path/to/output/file.txt', 'wb')
);

Also see https://bugs.php.net/bug.php?id=81145
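
If the script keeps running afterwards, you may want to keep the file handles so they can be closed explicitly; a sketch of the same call:

$in  = fopen('/path/to/input/file.txt', 'rb');
$out = fopen('/path/to/output/file.txt', 'wb');
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);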

Gordon
  • 1
    You should probably fclose these afterwards unless your script execution stops immediately afterwards. – Ciaran McNulty Dec 21 '12 at 10:26
  • I tried this on a 64-bit Windows 10 build with a 64-bit build of PHP 8.0.7. stream_copy_to_stream() seems to have the same problem with file sizes over 4 GiB. After every 4 GiB the file contents repeat. – StanE Jul 28 '21 at 22:29
1

You could use exec() if it's a Linux machine.

$srcFile = escapeshellarg($pathToSrcFile);
$trgFile = escapeshellarg($pathToTrgFile);

exec("cp $srcFile $trgFile");
enyo
1

I would copy it X bytes at a time (several megabytes each iteration).
X will be the most optimal size, which depends on your machine.
And I would do it not through the web server but as a standalone script, run through cron or a one-time call from the CLI.
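
A minimal sketch of such a standalone script, assuming the chunked_copy() function from the answer above is available:

# bigcopy.php -- run as: php bigcopy.php <source> <destination>
set_time_limit(0); # be explicit about removing the time limit for a long copy
chunked_copy($argv[1], $argv[2]);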

Itay Moav -Malimovka
0

If you want to copy files from one server to another and you have FTP access on both of them, you can simply use the FTP 'put' command on the source system to send the big file to the other system.
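
A sketch using PHP's FTP extension (host, credentials, and paths are placeholders):

$ftp = ftp_connect('ftp.example.com');
ftp_login($ftp, 'username', 'password');
ftp_pasv($ftp, true); # passive mode is often needed behind NAT/firewalls
ftp_put($ftp, '/remote/big.bin', '/local/big.bin', FTP_BINARY);
ftp_close($ftp);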