I'd like to know if there is a faster way of concatenating 2 text files in PHP than the usual way of opening `txt1` in `a+`, reading `txt2` line by line and copying each line to `txt1`.
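For reference, a minimal sketch of that line-by-line approach (filenames here are just placeholders, no error handling) would look something like this:

$dst = fopen('txt1', 'a+');                  // open the destination for appending
$src = fopen('txt2', 'r');
while (($line = fgets($src)) !== false) {
    fwrite($dst, $line);                     // copy one line at a time
}
fclose($src);
fclose($dst);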

- You could use exec to join the files in Unix. – karmafunk Jul 01 '13 at 15:02
- How are you copying each line to txt1? – sroes Jul 01 '13 at 15:02
- You could use `file_get_contents` to get the entire file at once. – Bart Friederichs Jul 01 '13 at 15:03
- http://php.net/manual/en/function.file-get-contents.php "file_get_contents() is the preferred way to read the contents of a file into a string. *It will use memory mapping techniques if supported by your OS to enhance performance*." – Marcello Romani Jul 01 '13 at 15:06
- An important question however is _how big_ the files involved are. I'm not sure `file_get_contents` is the best method to concatenate some GB-sized files... – Marcello Romani Jul 01 '13 at 15:08
- Why do you want to use PHP for this? What have you tried so far, where are you stuck? – Nico Haase May 25 '20 at 06:50
5 Answers
If you want to use a pure-PHP solution, you could use `file_get_contents` to read the whole file into a string and then write that out (no error checking, just to show how you could do it):
$fp1 = fopen("txt1", 'a+');          // open the destination for appending
$file2 = file_get_contents("txt2");  // read the whole source file into a string
fwrite($fp1, $file2);                // append it in one write
fclose($fp1);

- That's good if you have plenty of memory and your files are not very large. – scott80109 Oct 14 '14 at 19:25
It's probably much faster to use the `cat` program on Linux, if you have command-line permissions for PHP:
system('cat txt1 txt2 > txt3');
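If the goal is to append txt2 onto txt1 directly rather than produce a third file, a hedged variant (assuming a *nix shell is available; the filenames are placeholders) could escape the arguments and use >> instead:

$cmd = sprintf('cat %s >> %s', escapeshellarg('txt2'), escapeshellarg('txt1'));
system($cmd, $exitCode);   // $exitCode should be 0 on success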

- 'cat' is not recognized as an internal or external command, operable program or batch file. (Appserv on Windows 7, PHP Version 5.2.6) – Guttemberg Oct 21 '15 at 23:47
- @Guttemberg sorry, I was assuming that this was being run on a *nix server – Patrick Oct 22 '15 at 14:44
- I've found `cat` is as fast (slow!) as Blackfire's `file_get/put_contents` answer for ~400 files of ~1 MB. The downside of `cat` is you can't make a progress bar thingie. – Rudie Dec 18 '15 at 14:52
- file_get/put_contents is very stupid unless your files are very small. try to concatenate several 1GB files with 512M in php.ini ;) – Tertium Dec 20 '16 at 20:34
$content = file_get_contents("file1");              // read the whole source file into a string
file_put_contents("file2", $content, FILE_APPEND);  // append it to the destination file

I have found using *nix `cat` to be the most effective here, but if for whatever reason you don't have access to it, and you are concatenating large files, then you can use this line-by-line function (error handling stripped for simplicity):
function catFiles($arrayOfFiles, $outputPath) {
    $dest = fopen($outputPath, "a");      // open the destination for appending
    foreach ($arrayOfFiles as $f) {
        $FH = fopen($f, "r");
        $line = fgets($FH);
        while ($line !== false) {         // copy the current file line by line
            fputs($dest, $line);
            $line = fgets($FH);
        }
        fclose($FH);
    }
    fclose($dest);
}
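For example, appending txt2 onto txt1 with the function above (filenames are just placeholders) would be:

catFiles(array("txt2"), "txt1");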

While the fastest way is undoubtedly to use OS commands, like cp or cat, this is hardly advisable for compatibility.
The fastest "PHP only" way is using `file_get_contents`, which reads the whole source file in one shot, but it also has some drawbacks: it will require a lot of memory for large files, and for this reason it may fail depending on the memory assigned to PHP.
A universal, clean and fast solution is to use `fread` and `fwrite` with a large buffer.
If the file is smaller than the buffer, all reading will happen in one burst, so speed is optimal; otherwise reading happens in big chunks (the size of the buffer), so the overhead is minimal and speed is quite good.
Reading line by line with `fgets` instead has to test every character, one by one, to see whether it's a newline or line feed. Also, reading a file with many short lines with `fgets` will be slower, as you will read many little pieces of different sizes, depending on where the newlines are positioned.
`fread` is faster as it only checks for EOF (which is easy) and reads files using a fixed-size chunk you decide, so it can be made optimal for your OS, disk or kind of files (say you have many files under 12 KB: you can set the buffer size to 16 KB so they are all read in one shot).
<?php
// Code is untested, written on a mobile phone inside Stack Overflow; it comes from various examples online you can also check.
$BUFFER_SIZE = 1*1024*1024; // 1 MB, bigger is faster.. depending on file sizes and count
$dest = fopen($fileToAppendTo, "a+");
if (FALSE === $dest) die("Failed to open destination");
$handle = fopen("source.txt", "rb");
if (FALSE === $handle) {
    fclose($dest);
    die("Failed to open source");
}
while (!feof($handle)) {
    // copy one buffer-sized chunk from source to destination
    fwrite($dest, fread($handle, $BUFFER_SIZE));
}
fclose($handle);
fclose($dest);
?>

- Please add some explanation to your answer such that others can learn from it - especially as this looks like an improvement to the answer by lufc – Nico Haase May 25 '20 at 06:51