3

I am trying to merge all files in a dir into one text file with PHP.

All the files are text files and have one word on each line.

I just need to combine them all into one text file with all the words, one on each line.

The file extensions are two-digit numbers that are multiples of 5 (.10, .15, .20, .25), plus some .txt files.

This is what I have so far, but it only makes an empty text file.

<?php
$files = glob("./*.??");
$out = fopen("listTogether.txt", "w");
foreach($files as $file){
    $in = fopen($file, "r");
    while($line = fread($in)){
       fwrite($out, $line);
       echo "writing file";
    }
    fclose($in);
}
fclose($out);
?>
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8" />
<title>file merger</title>
</head>
<body>
    <p>Doo Wop</p>
</body>
</html>
Django Johnson
  • Why not simply `fwrite($out, file_get_contents($file))`? Reduces the loop body to just one line. – Jon Jun 06 '13 at 23:34
  • Doesn't this code throw errors? fread() requires two arguments... I'd use [file_get_contents](http://us2.php.net/manual/en/function.file-get-contents.php) instead. You're also not going to read any of your .txt files since your glob pattern only matches two-character extensions. – ernie Jun 06 '13 at 23:34
  • Your code seems to be ok. You can always add `echo "open:", $file;` and `echo "write:", $line;` to see what PHP is doing (step by step) – furas Jun 06 '13 at 23:36
  • I'd also suggest not creating new questions based on someone else's answer to your [other question](http://stackoverflow.com/questions/16972355/merge-all-files-in-directory-to-one-text-file). – ernie Jun 06 '13 at 23:38
  • @Jon That fixed it. Can you write that as an answer so I can select it? – Django Johnson Jun 06 '13 at 23:40

3 Answers

3

The problem is that the read/append loop doesn't read properly (fread() needs a length argument). If you're not processing very large files, then you can just simplify it to:

foreach ($files as $file) {
    fwrite($out, file_get_contents($file));
}
Jon
  • What's the best policy if you are processing large files? – rbassett Jul 27 '16 at 16:46
  • @rbassett in that case `file_get_contents` is not the right tool. Stream the contents in smaller chunks, e.g. with [`stream_copy_to_stream`](http://php.net/manual/en/function.stream-copy-to-stream.php). – Jon Jul 27 '16 at 18:40
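
A minimal sketch of that streaming approach, reusing the filenames from the question (untested; `stream_copy_to_stream` copies the data in internal chunks, so the whole file is never held in memory):

    $out = fopen("listTogether.txt", "w");
    foreach (glob("./*.??") as $file) {
        $in = fopen($file, "r");
        stream_copy_to_stream($in, $out); // chunked copy, low memory use
        fclose($in);
    }
    fclose($out);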
2

There are two issues with your code:

  • The call to fread() requires a second parameter indicating the maximum number of bytes to read. It is not elegant, but you can set it to something like 20 (if the file is longer, the while loop will keep running until all the content is consumed):

    while($line = fread($in,20)){
       fwrite($out, $line);
       echo "writing file";
    }
    
  • This may or may not be a problem: your glob pattern is `*.??`, so it captures files like "foo.55", but not "bar.5" or "mine.100". You might want to change it to `*.[0-9]*`:

    $files = glob("./*.[0-9]*");
    
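Putting the two fixes together, the whole script might look like this (a sketch, untested; the 8192-byte chunk size is an arbitrary choice, not something from the question):

    $files = glob("./*.[0-9]*");
    $out = fopen("listTogether.txt", "w");
    foreach ($files as $file) {
        $in = fopen($file, "r");
        // Read fixed-size chunks until fread() hits end of file.
        while ($line = fread($in, 8192)) {
            fwrite($out, $line);
        }
        fclose($in);
    }
    fclose($out);
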
jzer7
1

If you can (e.g. if you do not need to process the files before concatenation and are on an appropriate OS), the preferable way is to handle this as close to the OS as possible:

exec('cat *.?? > listTogether.txt');

If necessary, you can change the working directory either within the command itself (by prepending "cd /some/other/directory && " to it, taking into account safe mode limitations regarding paths), or by using chdir(). For example, with chdir() it might look like the sketch below.
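
A minimal sketch of the chdir() variant (the directory path here is just a placeholder):

    chdir('/path/to/word/files'); // hypothetical directory
    exec('cat *.?? > listTogether.txt');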

Tadeck