621

I am trying to use find -exec with multiple commands without any success. Does anybody know if commands such as the following are possible?

find *.txt -exec echo "$(tail -1 '{}'),$(ls '{}')" \;

Basically, I am trying to print the last line of each txt file in the current directory and, at the end of that line, print a comma followed by the filename.

Andrea Spadaccini
  • 12,378
  • 5
  • 40
  • 54
Andy
  • 6,219
  • 3
  • 15
  • 3
  • 4
    http://superuser.com/questions/236601/how-do-i-execute-multiple-commands-when-using-find – Ignacio Vazquez-Abrams Feb 25 '11 at 16:45
  • 1
    As far as checking for the possibility of the command, did you not try it out on your system? – Sriram May 18 '11 at 11:28
  • 5
    From the `find` manual page: `There are unavoidable security problems surrounding use of the -exec option; you should use the -execdir option instead.` http://unixhelp.ed.ac.uk/CGI/man-cgi?find – JVE999 Oct 19 '14 at 19:47
  • 1
    Related: https://unix.stackexchange.com/questions/156008/is-it-possible-to-use-find-exec-sh-c-safely – Kusalananda Sep 03 '17 at 07:02
  • 2
    @JVE999 link is broken, alternative at https://ss64.com/bash/find.html – Keith M Nov 13 '18 at 22:12

14 Answers

917

find accepts multiple -exec portions in a single command. For example:

find . -name "*.txt" -exec echo {} \; -exec grep banana {} \;

Note that in this case the second command will only run if the first one returns successfully, as mentioned by @Caleb. If you want both commands to run regardless of their success or failure, you could use this construct:

find . -name "*.txt" \( -exec echo {} \; -o -exec true \; \) -exec grep banana {} \;
Alan W. Smith
  • 24,647
  • 4
  • 70
  • 96
Tinker
  • 9,340
  • 1
  • 15
  • 7
  • 1
    how to grep twice? this is failing: find ./* -exec grep -v 'COLD,' {} \; -exec egrep -i "my_string" {} \; – rajeev Jan 22 '13 at 16:08
  • 73
    @rajeev The second exec will only run if the return code for the first returns success, otherwise it will be skipped. This should probably be noted in this answer. – Caleb Mar 20 '14 at 14:54
  • 1
    Note the use of `-n` in some of the other answers to suppress the newline generated by echo, which is handy if your second command produces only one line of output and you want them to be easier to read. – William Turrell Apr 19 '18 at 12:07
  • Pipe the results of the first `-exec` into `grep`? ```find . -iname "*.srt" -exec xattr -l {} | grep "@type" \; > text.txt``` – John Sep 29 '20 at 17:52
  • Here is way to run the second command (`grep banana`) **only** if the first (`echo`) **failed**: `find . -iname '*.zip' \( -exec unzip {} \; -o -exec 7z x {} \; \)`. My use case is I want to try to unzip *.zip files with `unzip`, then with `7z` if `unzip` fails (or cannot be found). – CDuv Nov 28 '22 at 15:03
141
find . -type d -exec sh -c "echo -n {}; echo -n ' x '; echo {}" \;
bensiu
  • 24,660
  • 56
  • 77
  • 117
Avari
  • 1,435
  • 1
  • 9
  • 2
  • 6
    If you want to run Bash instead of Bourne you can also use `... -exec bash -c ...` instead of `... -exec sh -c ...`. – Kenny Evitt Oct 15 '16 at 21:00
  • 18
    Never embed `{}` in shell code. See https://unix.stackexchange.com/questions/156008/is-it-possible-to-use-find-exec-sh-c-safely – Kusalananda Sep 03 '17 at 07:03
  • 7
    +1 @Kusalananda Injecting filenames is fragile and insecure. Use parameters. see [SC2156](https://github.com/koalaman/shellcheck/wiki/SC2156) – pambda Sep 30 '17 at 09:04
  • 1
    `find . -type d -exec sh -c 'echo -n $0; echo -n " x "; echo $0' "{}" \;` solves the issues raised in the above comments – Jamie Pate Oct 28 '22 at 23:46
  • I hate it when I have to spawn a subshell to do this, because this makes variable substitution too complicated; you need to [escape the variable twice](https://unix.stackexchange.com/q/379181/247246) because simply doing `sh -c "command ${variable}"` will likely break. – Константин Ван Mar 03 '23 at 18:10
75

One of the following:

find *.txt -exec awk 'END {print $0 "," FILENAME}' {} \;

find *.txt -exec sh -c 'echo "$(tail -n 1 "$1"),$1"' _ {} \;

find *.txt -exec sh -c 'echo "$(sed -n "\$p" "$1"),$1"' _ {} \;
Dennis Williamson
  • 346,391
  • 90
  • 374
  • 439
  • 20
    What is the underscore before {} for? – qed Aug 01 '13 at 10:05
  • I am also curious about the underscore. – Xu Wang Aug 15 '13 at 00:34
  • 7
    @qed: It is a throw-away value that holds the place of `$0`. Try this with "foobar" instead of "_": `find /usr/bin -name find -exec sh -c 'echo "[$0] [$1]"' foobar {} \;` - the output: "[foobar] [/usr/bin/find]". – Dennis Williamson Aug 15 '13 at 01:20
  • @XuWang: Please see my answer to qed in the comment above. – Dennis Williamson Aug 15 '13 at 01:20
  • @DennisWilliamson ah, I see now. Am I correct that the purpose is to be able to use `$1` as the argument instead of `$0` because `$1` is usually interpreted as the parameter? Thus, it's just for readability? – Xu Wang Aug 15 '13 at 01:40
  • 1
    @XuWang: Yes, I would say that's the case. As you know, `$0` is usually the program name (`ARGV[0]`). – Dennis Williamson Aug 15 '13 at 03:31
  • 5
    It is critical, for this method, that the script passed to `sh -c` is in single quotes, not double. Otherwise `$1` is in the wrong scope. – Nick Mar 27 '15 at 15:02
  • 3
    @Nick quotes has nothing to do with it - you can write `'$1'` with double quotes as long as you escape the variable (`"\$1"`). You can escape other characters as well (`"\""`). – Camilo Martin Jun 05 '16 at 05:46
  • 1
    @DennisWilliamson: It's been a while, but is there a reason why you do not pass the program name to `sh`? Something like this: `find *.txt -exec sh -c '$0 "$(tail -n 1 "$1"),$1"' echo {} \;` – AkselA Oct 12 '17 at 11:13
  • Or even `find *.txt -exec sh -c 'echo "$(tail -n 1 "$0"),$0"' {} \;` – AkselA Oct 12 '17 at 11:58
  • @AkselA: Yours also work, but I find that mine is clearer. – Dennis Williamson Oct 12 '17 at 16:53
  • Yeah, they're not exactly crystal clear any of them. I found myself trying to explain the command to someone else, and in so doing I found putting both the program and argument outside to make more logical sense, though not by much. As long as none of the approaches risk breaking stuff, I guess their all reasonable in their own way. – AkselA Oct 12 '17 at 17:16
  • @Pauseduntilfurthernotice. why adding a throw-away value instead of using `$0`? `find *.txt -exec sh -c 'echo "$(sed -n "\$p" "$0"),$0"' {} \;` – Mohammad Alavi Sep 20 '20 at 14:04
  • @MohammadAlavi: It is a pattern that I use consistently. Sometimes it's necessary to get the correct behavior. Compare `sh -c 'echo -e "<$@>\n<$0>\n<$1>\n<$2>"' _ foo bar` (correct output) to the same command without the throwaway `_`. – Dennis Williamson Jun 28 '22 at 14:17
34

Another way is like this:

multiple_cmd() { 
    tail -n1 $1; 
    ls $1 
}; 
export -f multiple_cmd; 
find *.txt -exec bash -c 'multiple_cmd "$0"' {} \;

In one line:

multiple_cmd() { tail -1 $1; ls $1; }; export -f multiple_cmd; find *.txt -exec bash -c 'multiple_cmd "$0"' {} \;
  • "multiple_cmd()" - is a function
  • "export -f multiple_cmd" - will export it so any other subshell can see it
  • "find *.txt -exec bash -c 'multiple_cmd "$0"' {} \;" - find that will execute the function on your example

In this way multiple_cmd can be as long and as complex as you need.
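For many files, a batched form can cut down on process startup (a sketch; exported functions remain visible to the bash -c child):

find *.txt -exec bash -c 'for f; do multiple_cmd "$f"; done' _ {} +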

Hope this helps.

Souvik Ghosh
  • 4,456
  • 13
  • 56
  • 78
al3x2ndru
  • 598
  • 5
  • 8
23

There's an easier way:

find ... | while read -r file; do
    echo "look at my $file, my $file is amazing";
done

Alternatively:

while read -r file; do
    echo "look at my $file, my $file is amazing";
done <<< "$(find ...)"
Camilo Martin
  • 37,236
  • 20
  • 111
  • 154
  • 2
    filenames can have newlines in them, this is why find has the -print0 argument and xargs has the -0 argument – abasterfield Nov 30 '16 at 21:37
  • 3
    @abasterfield I always hope never to find those in the wild lol – Camilo Martin Dec 06 '16 at 22:56
  • 1
    what I wanted to do was "find ... -exec zcat {} | wc -l \;" which didn't work. However, find ... | while read -r file; do echo "$file: `zcat $file | wc -l`"; done does work, so thank you! – Greg Dougherty Aug 10 '17 at 12:37
  • In comment above I have "back ticks" around "zcat $file | wc -l". Unfortunately SO turns those into formatting, so I've added it as an actual answer with the correct code visible – Greg Dougherty Aug 10 '17 at 12:41
  • 1
    @GregDougherty You can escape the backticks `\`` to do that you use backslashes like so: `\​\`` (still, that's another good reason to use `$()` instead). – Camilo Martin Aug 14 '17 at 05:48
12

Extending @Tinker's answer:

In my case, I needed to run several commands inside the -exec to print both the filename and the matched text in files containing a certain string.

I was able to do it with:

find . -name config -type f \( -exec  grep "bitbucket" {} \; -a -exec echo {} \;  \) 

The result is:

    url = git@bitbucket.org:a/a.git
./a/.git/config
    url = git@bitbucket.org:b/b.git
./b/.git/config
    url = git@bitbucket.org:c/c.git
./c/.git/config
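As the comments on this answer point out, grep -H prints the filename alongside each matching line, so a single -exec also works, with filename and match on one line (a sketch):

find . -name config -type f -exec grep -H "bitbucket" {} \;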
user9869932
  • 6,571
  • 3
  • 55
  • 49
  • 3
    You can also print the filename and the grep'd content on a single line by passing `/dev/null` as a second argument to the `grep` command with one `-exec` parameter: ```find . -name config -type f -exec grep "bitbucket" {} /dev/null \;``` – Bill Feth Mar 24 '20 at 14:56
    In this case, you could do: `$ find . -name config -type f -exec grep -nl "bitbucket" {} \;` And it will only print the name of the files that matches – Paulo Henrique Lellis Gonçalves Sep 02 '21 at 10:02
  • FYI: `grep -H` does exactly that. Prints the file name along with the matching line. – Emsi Mar 09 '23 at 12:02
9

I don't know if you can do this with find alone, but an alternative solution would be to create a shell script and run it with find.

lastline.sh:

#!/bin/sh
echo "$(tail -1 "$1"),$1"

Make the script executable

chmod +x lastline.sh

Use find:

find . -name "*.txt" -exec ./lastline.sh {} \;
Andrea Spadaccini
  • 12,378
  • 5
  • 40
  • 54
  • 8
    backticks are deprecated, please encourage the usage of $(...) which is better readable, fontindependently, and easy to nest. Thank you. – user unknown Mar 12 '11 at 18:41
7

Thanks to Camilo Martin, I was able to answer a related question:

What I wanted to do was

find ... -exec zcat {} | wc -l \;

which didn't work. However,

find ... | while read -r file; do echo "$file: `zcat $file | wc -l`"; done

does work, so thank you!
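The pipe in the first form is consumed by the shell that launches find rather than by -exec; wrapping the pipeline in sh -c keeps it per file (a sketch, not part of the original answer):

find ... -exec sh -c 'echo "$1: $(zcat "$1" | wc -l)"' _ {} \;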

Greg Dougherty
  • 3,281
  • 8
  • 35
  • 58
5

Dennis's first answer solves the problem, but it is no longer a find with several commands in only one exec, as the title suggests. To get several commands into a single exec we have to look for something else. Here is an example:

Keep the last 10,000 lines of every .log file that has been modified in the last 7 days, using one exec command with several {} references.

1) See what the command will do, and on which files:

find / -name "*.log" -a -type f -a -mtime -7 -exec sh -c "echo tail -10000 {} \> fictmp; echo cat fictmp \> {} " \;

2) Do it (note that the "\>" becomes a plain ">"; this is intentional):

find / -name "*.log" -a -type f -a -mtime -7 -exec sh -c "tail -10000 {} > fictmp; cat fictmp > {} ; rm fictmp" \;

4

I usually embed the find in a small for-loop one-liner, where find is executed in a command substitution with $().

Your command would look like this then:

for f in $(find *.txt); do echo "$(tail -1 $f), $(ls $f)"; done

The good thing is that instead of {} you just use $f and instead of the -exec … you write all your commands between do and ; done.

Not sure what you actually want to do, but maybe something like this?

for f in $(find *.txt); do echo $f; tail -1 $f; ls -l $f; echo; done
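A version that avoids looping over find output, in line with the ShellCheck advice in the comments below (a sketch):

find *.txt -exec sh -c 'for f; do echo "$f"; tail -n 1 "$f"; ls -l "$f"; echo; done' _ {} +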
Johannes Braunias
  • 3,135
  • 1
  • 13
  • 8
  • 4
    It's worth noting that according to ShellCheck it's not the best solution - `SC2044: For loops over find output are fragile. Use find -exec or a while read loop.` There is a great example and description on ShellCheck wiki https://github.com/koalaman/shellcheck/wiki/Sc2044 – Alex Baranowski May 18 '21 at 15:08
  • 1
    This also is exactly what [BashPitfalls #1](https://mywiki.wooledge.org/BashPitfalls#for_f_in_.24.28ls_.2A.mp3.29) advises against. – Charles Duffy Nov 30 '21 at 16:54
3

I found this solution (maybe it has already been mentioned in a comment, but I could not find an answer with it):

You can execute multiple commands in a row using "bash -c":

find . <SOMETHING> -exec bash -c "EXECUTE 1 && EXECUTE 2 ; EXECUTE 3" \;

In your case:

find . -name "*.txt" -exec bash -c "tail -1 '{}' && ls '{}'" \;

I tested it with a test file:

[gek@tuffoserver tmp]$ ls *.txt
casualfile.txt
[gek@tuffoserver tmp]$ find . -name "*.txt" -exec bash -c "tail -1 '{}' && ls '{}'" \;
testonline1=some TEXT
./casualfile.txt
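The same idea without embedding '{}' inside the bash -c string, which the comments elsewhere on this page warn is fragile for unusual filenames (a sketch):

find . -name "*.txt" -exec bash -c 'tail -n 1 "$1" && ls "$1"' _ {} \;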
tuffo19
  • 310
  • 3
  • 8
1

should use xargs :)

find *.txt -type f -exec tail -1 {} \; | xargs -ICONSTANT echo $(pwd),CONSTANT

Another one (working on OS X):

find *.txt -type f -exec echo ,$(pwd) {} + -exec tail -1 {} + | tr ' ' '/'
smapira
  • 87
  • 7
  • 3
    This overlooks a major use case for `find` - situations where the number of matching files is too large for a command line. `-exec` is a way to get around this limit. Piping out to a utility misses that benefit. – Chris Johnson Jan 27 '17 at 14:32
  • 1
    `xargs -n` exists to choose the number of matches per invocation. `xargs -n 1 foocmd` will execute `foocmd {}` for every match. – AndrewF Apr 03 '19 at 23:27
1

A find+xargs answer.

The example below finds all .html files and creates a copy with the .BAK extension appended (e.g. 1.html > 1.html.BAK).

Single command with multiple placeholders

find . -iname "*.html" -print0 | xargs -0 -I {} cp -- "{}" "{}.BAK"

Multiple commands with multiple placeholders

find . -iname "*.html" -print0 | xargs -0 -I {} echo "cp -- {} {}.BAK ; echo {} >> /tmp/log.txt" | sh

# if you need to do anything bash-specific then pipe to bash instead of sh

This command will also work with files that start with a hyphen or contain spaces, such as -my file.html, thanks to parameter quoting and the -- after cp, which signals to cp the end of options and the beginning of the actual file names.

-print0 outputs the results with null-byte terminators.

For xargs, the -I {} option defines {} as the placeholder (you can use whichever placeholder you like); -0 indicates that input items are null-separated.
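A variant that keeps the filenames out of the code string entirely, along the lines of the safety comments below (a sketch; xargs appends the null-separated names as positional arguments to sh):

find . -iname "*.html" -print0 | xargs -0 sh -c 'for f; do cp -- "$f" "$f.BAK" && echo "$f" >> /tmp/log.txt; done' _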

ccpizza
  • 28,968
  • 18
  • 162
  • 169
  • `xargs -I{} sh -c '...{}...'` has major security problems, and `xargs -I{} echo '...{}...' | sh` is just as bad. What happens when you get a filename that contains `$(/tmp/evil)` in its name as literal text? (Yes, every character in that string is valid in a filename). Or `$(rm -rf ~)'$(rm -rf ~)'` -- yes, again, single quotes can exist in filenames on UNIX. – Charles Duffy Nov 30 '21 at 16:55
  • 1
    The _safe_ thing is to keep your names out-of-band from your code. `find ... -exec bash -c 'for arg; do something_with "$arg"; done' _ {} +` keeps the fienames as arguments, out-of-band from the string interpreted by the shell as code. – Charles Duffy Nov 30 '21 at 16:56
0

Here is my bash script that you can use to find multiple files and then process them all using a command.

Example of usage. This command applies the `file` Linux command to each found file:

./finder.sh file fb2 txt

Finder script:

#!/bin/bash
# Find files and process them using an external command.
# Usage:
#   ./finder.sh ./processing_script.sh txt fb2 fb2.zip doc docx

counter=0
find_results=()
for ext in "${@:2}"
do
    # @see https://stackoverflow.com/a/54561526/10452175
    readarray -d '' ext_results < <(find . -type f -name "*.${ext}" -print0)

    for file in "${ext_results[@]}"
    do
        counter=$((counter+1))
        find_results+=("${file}")
        echo ${counter}") ${file}"
    done
done
countOfResults=$((counter))
echo -e "Found ${countOfResults} files.\n"


echo "Processing..."
counter=0
for file in "${find_results[@]}"
do
    counter=$((counter+1))
    echo -n ${counter}"/${countOfResults}) "
    eval "$1 '${file}'"
done
echo "All files have been processed."

James Bond
  • 2,229
  • 1
  • 15
  • 26