
I would like to concatenate a number of text files into one large file in terminal. I know I can do this using the cat command. However, I would like the filename of each file to precede the "data dump" for that file. Anyone know how to do this?

what I currently have:

file1.txt = bluemoongoodbeer

file2.txt = awesomepossum

file3.txt = hownowbrowncow

cat file1.txt file2.txt file3.txt

desired output:

file1

bluemoongoodbeer

file2

awesomepossum

file3

hownowbrowncow
Evan Carroll
Nick

21 Answers


I was looking for the same thing and found this suggestion:

tail -n +1 file1.txt file2.txt file3.txt

Output:

==> file1.txt <==
<contents of file1.txt>

==> file2.txt <==
<contents of file2.txt>

==> file3.txt <==
<contents of file3.txt>

If there is only a single file then the header will not be printed. If using GNU utils, you can use -v to always print a header.
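As a quick sketch with a throwaway file (the name is just a placeholder), forcing the header with GNU tail looks like:

```shell
# Scratch directory with a sample file (name is illustrative)
cd "$(mktemp -d)"
printf 'bluemoongoodbeer\n' > file1.txt

# GNU tail: -v prints the "==> name <==" header even for a single file
tail -v -n +1 file1.txt
# ==> file1.txt <==
# bluemoongoodbeer
```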

DS.
  • This works with the GNU tail (part of GNU Coreutils) as well. – ArjunShankar Apr 10 '12 at 16:35
  • I want to concatenate a large number of files. If I use this (`tail +1 *.txt`) I get an error `Too many open files`. I fixed this with: ``OLD_ULIMIT=`ulimit -n`;ulimit -n 1024;tail +1 *.txt;ulimit -n $OLD_ULIMIT;``, where 1024 was large enough for me. – Alec Jacobson Nov 06 '13 at 11:02
  • Awesome `-n +1` option! An alternative: `head -n-0 file1 file2 file3`. – Frozen Flame Dec 20 '15 at 03:22
  • Works great with both BSD tail and GNU tail on Mac OS X. You can leave out the space between `-n` and `+1`, as in `-n+1`. – vdm Nov 22 '17 at 10:26
  • `tail -n +1 *` was exactly what I was looking for, thanks! – kR105 Jun 20 '18 at 16:39
  • That, or `ls | xargs tail -n +1` should also work fine. – Bar Sep 21 '18 at 01:21
  • Works on macOS 10.14.4: `sudo ulimit -n 1024; find -f . -name "*.rb" | xargs tail -n+1 > ./source_ruby.txt` – kolas Apr 09 '19 at 12:38
  • Also, if you need to concatenate without file names, use the `-q` (quiet) flag with the `head` or `tail` command. – Coddy May 19 '20 at 16:54
  • I've always found it bizarre that this is the default behavior for `head` & `tail`, yet it's not even an option for `cat`. AND it still hasn't been added after all these years. Like, what?? I use this so often that I added `alias cats='head -n -0'` – Mike B May 27 '20 at 05:14
  • I tried `tail -n +1 *` as well, but it does not skip dirs and will be interrupted by them. Is there a way to skip dirs? – mgutt Sep 29 '20 at 08:16
  • As a bash function (SO comments don't support line breaks): `tails() { tail -v -n +1 "$@" | less; }`. – young_souvlaki May 19 '21 at 16:52
  • Can we show only the filename instead of its full path? – alper Mar 31 '23 at 16:54
  • Why it works: the `+1` argument makes `tail` read each file starting from the first line, so it displays the entire content. – Memin Aug 15 '23 at 17:00

I used grep for something similar:

grep "" *.txt

It does not give you a 'header', but prefixes every line with the filename.
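For example, with two throwaway files (the names and contents are made up for illustration):

```shell
cd "$(mktemp -d)"                       # scratch dir so the glob is predictable
printf 'bluemoongoodbeer\n' > file1.txt
printf 'awesomepossum\n'    > file2.txt

# The empty pattern matches every line, so each line comes out as "name:line"
grep "" *.txt
# file1.txt:bluemoongoodbeer
# file2.txt:awesomepossum
```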

Theo
  • Output breaks if `*.txt` expands to only one file. In this regard, I'd advise `grep '' /dev/null *.txt` – antak Jul 10 '14 at 03:39
  • +1 for showing me a new use for grep. This met my needs perfectly: in my case each file only contained one line, so it gave me a neatly formatted output that was easily parsable. – verboze Mar 31 '15 at 21:03
  • `grep` will only print file headers if there is more than one file. If you want to make sure to print the file path always, use `-H`. If you don't want the headers, use `-h`. Also note it will print `(standard input)` for STDIN. – Jorge Bucaran Sep 30 '15 at 22:31
  • You can also use `ag` (the silver searcher): by default, `ag . *.txt` prefixes each file with its name, and each line with its number. – anol Jul 25 '16 at 11:32
  • If you want to reference a different directory, you can control the relative root of the names with this one-liner: `(cd /var/log; grep "" */*.log)`. This will print names relative to `/var/log`. – Basic Mar 08 '17 at 01:36
  • Of note, passing `-n` to grep also yields line numbers, which allows you to write simple linters with pinpointing that could be picked up by, e.g., emacs. – DepressedDaniel Mar 13 '17 at 03:23
  • Nice one. I wasn't aware of it. – JasonGenX Feb 07 '19 at 15:56

This should do the trick as well:

$ find . -type f -print -exec cat {} \;
./file1.txt
Content of file1.txt
./file2.txt
Content of file2.txt

Here is the explanation for the command-line arguments:

find    = linux `find` command finds filenames, see `man find` for more info
.       = in current directory
-type f = only files, not directories
-print  = show found file
-exec   = additionally execute another linux command
cat     = linux `cat` command, see `man cat`, displays file contents
{}      = placeholder for the currently found filename
\;      = tell `find` command that it ends now here

You can further combine searches through boolean operators like -and or -or. find -ls is nice, too.
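A sketch of combining tests this way (the extensions here are arbitrary examples):

```shell
# Print a header line, then the contents, for .txt and .md files only
find . -type f \( -name '*.txt' -or -name '*.md' \) -print -exec cat {} \;
```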

Flimm
Maxim_united
  • Could you explain more what this command does? It's exactly what I needed. – AAlvz Feb 09 '14 at 02:57
  • And actually it works fine also without the `-print` instruction. – AAlvz Feb 09 '14 at 03:08
  • This is Linux's standard find command. It searches all files in the current directory, prints their names, then for each one, cats the file. Omitting the `-print` won't print the filename before the `cat`. – Maxim_united Apr 17 '14 at 09:45
  • You can also use `-printf` to customize the output. For example: `find *.conf -type f -printf '\n==> %p <==\n' -exec cat {} \;` to match the output of `tail -n +1 *`. – Maxim_united Apr 17 '14 at 09:55
  • `-printf` doesn't work on Mac, unless you want to `brew install findutils` and then use `gfind` instead of `find`. – Matt Fletcher Jan 13 '16 at 11:25
  • `cat` from GNU coreutils 8.22 does not include the header by default, and does not appear to have an option for doing so. Combining your `find` syntax with DS.'s answer, the command `find . -type f -name 'file*' -exec tail -n +1 {} +` works exactly. – Patrick M Aug 01 '18 at 18:30
  • If you want colors you can use the fact that `find` allows multiple `-exec`s: `find -name '*.conf' -exec printf '\n\e[33;1m%s\e[0m\n' {} \; -exec cat {} \;` – banbh Mar 20 '19 at 17:24
  • This one gets the order wrong for me. I want the files concatenated according to their numbering. – user313032 Aug 29 '22 at 02:47

When there is more than one input file, the more command concatenates them and also includes each filename as a header.

To concatenate to a file:

more *.txt > out.txt

To concatenate to the terminal:

more *.txt | cat

Example output:

::::::::::::::
file1.txt
::::::::::::::
This is
my first file.
::::::::::::::
file2.txt
::::::::::::::
And this is my
second file.
Asclepius
Steinfadt
  • @Acumenus For myself I had to use: `String="${String//$'\n'::::::::::::::$'\n'/|}"` then: `String="${String//::::::::::::::$'\n'/}"` and finally: `String="${String//$'\n'/|}"` to make into a YAD array: `IFS='|' read -ra OLD_ARR <<< "$String"` – WinEunuuchs2Unix Jul 12 '20 at 19:57
  • @Acumenus First I had to build the string field using: `String=$(sudo -u "$SUDO_USER" ssh "$SUDO_USER"@"$TRG_HOST" \ "find /tmp/$SUDO_USER/scp.*/*.Header -type f \ -printf '%Ts\t%p\n' | sort -nr | cut -f2 | \ xargs more | cat | cut -d'|' -f2,3" \ )` – WinEunuuchs2Unix Jul 12 '20 at 20:04
  • @WinEunuuchs2Unix I don't follow. If you'd like, you can explain more clearly in a https://gist.github.com/, and link to it instead. – Asclepius Jul 12 '20 at 20:06
  • @Acumenus Better yet I'll upload the script to github when done. It's just to copy root files between hosts using sudo because hosts don't have root accounts. The code in comments was to select headers for previous payloads. Kind of a fringe thing that won't interest most users. – WinEunuuchs2Unix Jul 12 '20 at 21:27
  • This does not work for me on macOS. I only get the file contents. – ben-albrecht Jan 20 '21 at 21:01
  • Cool trick. I got the output as I needed. Thanks. – sourabh kesharwani Mar 09 '22 at 11:38

This should do the trick:

for filename in file1.txt file2.txt file3.txt; do
    echo "$filename"
    cat "$filename"
done > output.txt

or to do this for all text files recursively:

find . -type f -name '*.txt' -print | while IFS= read -r filename; do
    echo "$filename"
    cat "$filename"
done > output.txt
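A bash-specific sketch (not POSIX sh) of the same recursive loop that also survives spaces or newlines in filenames; the output path is illustrative, chosen outside the searched tree so find can never pick it up:

```shell
# NUL-delimited names survive spaces and newlines (bash: read -d '')
find . -type f -name '*.txt' -print0 |
while IFS= read -r -d '' filename; do
    echo "$filename"
    cat "$filename"
done > /tmp/output.txt   # /tmp is outside the tree being searched
```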
Chris Eberle
  • Didn't work. I just wrote some really ugly awk code: for i in $listoffiles do awk '{print FILENAME,$0,$1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11}' $i >> concat.txt done – Nick May 06 '11 at 22:13
  • ...care to elaborate? That's about as simple as bash code gets. – Chris Eberle May 06 '11 at 22:14
  • @Nick: your awk line shouldn't even work, considering that `$0` is the entire line, so you've actually got repeating columns in there... – Chris Eberle May 06 '11 at 22:20
  • @Nick: Nifty solution otherwise :) – Chris Eberle May 06 '11 at 22:27
  • @Chris: yes, but it's a lot uglier than I would like it to be. Maybe your code wasn't working for me because I'm using >> to catch the stdout? – Nick May 06 '11 at 22:28
  • @Nick: the other possibility is that your list of filenames has spaces. `for` in bash is rather dumb and splits on any whitespace. – Chris Eberle May 06 '11 at 22:55
  • Make sure not to use a * as the wildcard... it winds up infinitely recurring for some reason. – jayunit100 Mar 23 '12 at 20:15
  • @jayunit100: the infinite recurring happens if your target file matches the wildcard. Name it such that it does not match. – rshetye Apr 06 '14 at 03:18
  • I like this method as it gives much more flexibility with how you want to present the output! – Matt Fletcher May 25 '15 at 15:49
find . -type f -print0 | xargs -0 -I % sh -c 'echo %; cat %'

This will print the full filename (including path), then the contents of the file. It is also very flexible, as you can use -name "expr" for the find command, and run as many commands as you like on the files.
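A sketch of a quoting-safe variant of the same idea: filenames are passed as arguments to the inline script instead of being substituted into the command string, so names containing spaces or quotes cannot break it.

```shell
# Each batch of found filenames becomes "$@" inside the inline script;
# the literal "sh" fills $0.
find . -type f -exec sh -c 'for f; do echo "$f"; cat "$f"; done' sh {} +
```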

Kevin
  • It is also quite straightforward to combine with `grep`. To use with `bash`: `find . -type f -print | grep PATTERN | xargs -n 1 -I {} -i bash -c 'echo ==== {} ====; cat {}; echo'` – dojuba May 26 '15 at 11:06

And the missing awk solution is:

$ awk '(FNR==1){print ">> " FILENAME " <<"}1' *
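The trailing `1` is an always-true pattern whose default action is to print the current line. Spelled out for readability (the file names here are placeholders), the same program might read:

```shell
# FNR resets to 1 at the start of each input file, so the header rule
# fires once per file; the second rule echoes every line.
awk 'FNR == 1 { print ">> " FILENAME " <<" } { print }' file1.txt file2.txt
```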
kvantour
  • Thanks for your great answer. Can you explain the meaning of the `1` at the end of the expression? – bwangel Sep 01 '19 at 03:17
  • Thanks. You can have a look at [what is the meaning of 1 at the end of awk script](https://unix.stackexchange.com/questions/63891) – kvantour Sep 01 '19 at 15:22
  • `awk '{ if (FNR == 1) print "==>" FILENAME "<=="; print }' *.txt`, `print` would be better than `1`. And `awk '{ if (FNR == 1) print "\033[31m==>" FILENAME "<==\033[0m" > "/dev/stderr"; print }' *.txt` prints the file name in color to `stderr`. – Míng Nov 28 '22 at 17:02
  • @Mr.Míng This comment does not add any value. Furthermore, the whole action statement goes against the ideology of the language syntax. – kvantour Nov 28 '22 at 19:17
  • @kvantour Indeed, I don't know much about the ideology, but I just think `print` is more readable than `1`, especially for people like bwangel and me. – Míng Nov 28 '22 at 20:14

This is how I normally handle formatting like that:

for i in *; do echo "$i"; echo ; cat "$i"; echo ; done ;

I generally pipe the cat into a grep for specific information.


I like this option

for x in ./*.php; do echo "$x"; grep -i 'menuItem' "$x"; done

Output looks like this:

./debug-things.php
./Facebook.Pixel.Code.php
./footer.trusted.seller.items.php
./GoogleAnalytics.php
./JivositeCode.php
./Live-Messenger.php
./mPopex.php
./NOTIFICATIONS-box.php
./reviewPopUp_Frame.php
            $('#top-nav-scroller-pos-<?=$activeMenuItem;?>').addClass('active');
            gotToMenuItem();
./Reviews-Frames-PopUps.php
./social.media.login.btns.php
./social-side-bar.php
./staticWalletsAlerst.php
./tmp-fix.php
./top-nav-scroller.php
$activeMenuItem = '0';
        $activeMenuItem = '1';
        $activeMenuItem = '2';
        $activeMenuItem = '3';
./Waiting-Overlay.php
./Yandex.Metrika.php
ch3ll0v3k

If the files all have the same name or can be matched by find, you can do (e.g.):

find . -name create.sh | xargs tail -n +1

to find, show the path of and cat each file.
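If any matched path may contain spaces, a NUL-delimited sketch of the same pipeline would be:

```shell
# -print0 / -0 keep filenames with spaces (or newlines) in one piece
find . -name 'create.sh' -print0 | xargs -0 tail -n +1
```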

drkvogel

You can use this simple command instead of a for loop:

ls -ltr | awk '{print $9}' | xargs head
keyser
Gagan

If you like colors, try this:

for i in *; do echo; echo $'\e[33;1m'$i$'\e[0m'; cat $i; done | less -R

or:

tail -n +1 * | grep -e $ -e '==.*'

or: (with package 'multitail' installed)

multitail *
sjas
  • For the sake of colors highlighting the filename: `find . -type f -name "*.txt" | xargs -I {} bash -c "echo $'\e[33;1m'{}$'\e[0m';cat {}"` – MediaVince Jan 05 '17 at 10:22

If you want to replace those ugly ==> <== markers with something else:

tail -n +1 *.txt | sed -e 's/==>/\n###/g' -e 's/<==/###/g' >> "files.txt"

explanation:

tail -n +1 *.txt - output all files in folder with header

sed -e 's/==>/\n###/g' -e 's/<==/###/g' - replace ==> with new line + ### and <== with just ###

>> "files.txt" - output all to a file
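Put together with the question's sample data (this relies on GNU sed, where \n in the replacement becomes a newline), a quick sketch:

```shell
cd "$(mktemp -d)"
printf 'bluemoongoodbeer\n' > file1.txt
printf 'awesomepossum\n'    > file2.txt

# Headers come out as "### file1.txt ###" instead of "==> file1.txt <=="
tail -n +1 *.txt | sed -e 's/==>/\n###/g' -e 's/<==/###/g'
```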

boroboris

I made a combination of:

cat /sharedpath/{unique1,unique2,unique3}/filename > newfile

and

tail -n +1 file1 file2

into this:

tail -n +1 /sharedpath/{folder1,folder2,...,folder_n}/file.extension | cat > /sharedpath/newfile

The result is a newfile that contains the content from each subfolder (unique1,unique2..) in the {} brackets, separated by subfolder name.

Note: unique1 = folder1.

In my case the file.extension has the same name in all subfolders.

MartiTro

Here is a really simple way. You said you want to cat, which implies you want to view the entire file. But you also need the filename printed.

Try this

head -n99999999 *

or:

head -n99999999 file1.txt file2.txt file3.txt

Hope that helps

Trey Brister
find . -type f -exec cat {} \; -print
Matt Clark
AIX 7.1 ksh

... glomming onto those who've already mentioned that head works for some of us:

$ r head
head file*.txt
==> file1.txt <==
xxx
111

==> file2.txt <==
yyy
222
nyuk nyuk nyuk

==> file3.txt <==
zzz
$

My need is to read the first line; as noted, if you want more than 10 lines, you'll have to add options (head -9999, etc).

Sorry for posting a derivative comment; I don't have sufficient street cred to comment/add to someone's comment.


If you want the result in the same format as your desired output you can try:

for file in `ls file{1..3}.txt`; \
do echo "$file" | cut -d '.' -f 1; \
cat "$file"; done;

Result:

file1
bluemoongoodbeer
file2
awesomepossum
file3
hownowbrowncow

You can put echo -e before and after the cut so you have the spacing between the lines as well:

$ for file in `ls file{1..3}.txt`; do echo $file | cut -d '.' -f 1; echo -e; cat $file; echo -e  ; done;

Result:

file1

bluemoongoodbeer

file2

awesomepossum

file3

hownowbrowncow
  • Explanation: the `for` loop iterates over the list produced by the `ls` command. `$ ls file{1..3}.txt` results in: `file1.txt file2.txt file3.txt`. Each iteration will `echo` the `$file` string, then it's piped into a `cut` command where I used `.` as the field separator, which breaks fileX.txt into two pieces and prints field 1 (field 2 is the txt). The rest should be clear. – Fekete Sumér Feb 01 '16 at 13:01

This method will print filename and then file contents:

tail -f file1.txt file2.txt

Output:

==> file1.txt <==
contents of file1.txt ...
contents of file1.txt ...

==> file2.txt <==
contents of file2.txt ...
contents of file2.txt ...
serenesat
  • The `-f` is useful if you want to track a file which is being written to, but not really within the scope of what the OP asked. – tripleee Sep 04 '15 at 09:05
  • This solution only prints the last few lines - not the whole files' contents. – stason Oct 24 '20 at 05:37

For solving this tasks I usually use the following command:

$ cat file{1..3}.txt >> result.txt

It's a very convenient way to concatenate files if the number of files is quite large.

Igor

First I created each file: echo 'information' > file1.txt for each file[123].txt.

Then I printed each file to make sure the information was correct: tail file?.txt

Then I did this: tail file?.txt >> Mainfile.txt. This created Mainfile.txt, storing the information from each file in one main file.

cat Mainfile.txt confirmed it was okay.

==> file1.txt <== bluemoongoodbeer

==> file2.txt <== awesomepossum

==> file3.txt <== hownowbrowncow

John Ray