
This is the command I've been using for finding matches for queryString in PHP files in the current directory, with grep, case-insensitive, showing the matching lines:

find . -iname "*php" -exec grep -iH queryString {} \;

Is there a way to also pipe just the file name of the matches to another script?

I could probably run the -exec command twice, but that seems inefficient.

What I'd love to do on Mac OS X is then actually "reveal" each matching file in the Finder. I think I can handle that part. If I had to give up the inline matches and just let grep list the file names, then pipe those to a third script, that would be fine too - I would settle.

But I'm actually not even sure how to pipe the output (the matched file names) to somewhere else...

Help! :)

Clarification

I'd like to reveal each of the files in a Finder window - so I'm probably not going to use the -q flag and stop at the first one.

I'm going to run this in the console. Ideally I'd like to see the inline matches printed there, as well as be able to pipe them to another script, like osascript (AppleScript, to reveal them in the Finder). That's why I have been using -H - because I like to see both the file name and the match.

If I had to settle for just using -l so that the file name could more easily be piped to another script, that would be OK, too. But I think, after looking at the reply below from @Charlie Martin, that xargs could be helpful here in doing both at the same time with a single find and a single grep command.

I did say bash, but I don't really mind if this needs to be run as /bin/sh instead - I don't know much about the differences yet, but I do know there are some important ones.

Thank you all for the responses, I'm going to try some of them at the command line and see if I can get any of them to work and then I think I can choose the best answer. Leave a comment if you want me to clarify anything more.

Thanks again!

cwd

9 Answers


You bet. The usual thing is something like

  $ find /path -name pattern -print | xargs command

So you might for example do

  $ find . -name '*.[ch]' -print | xargs grep -H 'main' 

(Quiz: why -H?)

You can carry this on further; for example, you might use

  $ find . -name '*.[ch]' -print | xargs grep -H 'main' | cut -d ':' -f 1

to get the vector of file names for files that contain 'main', or

  $ find . -name '*.[ch]' -print | xargs grep -H 'main' | cut -d ':' -f 1 |
      xargs growlnotify -

to have each name become a Growl notification.
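A quick way to see the cut step in action, using a throwaway scratch directory (the file names and contents here are invented for illustration):

```shell
# Build a scratch tree, then show that cut keeps only the file-name field
tmp=$(mktemp -d)
printf 'int main(void) { return 0; }\n' > "$tmp/a.c"
printf 'static int unused;\n'           > "$tmp/b.c"

# grep -H emits "file:matching line"; cut -d ':' -f 1 keeps what precedes the first ':'
find "$tmp" -name '*.[ch]' -print | xargs grep -H 'main' | cut -d ':' -f 1
# prints only the path of a.c

rm -rf "$tmp"
```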

You could also do

 $ grep pattern `find /path -name pattern`

or

 $ grep pattern $(find /path -name pattern)

(in bash(1) at least these are equivalent) but you can run into limits on the length of a command line that way.
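The command-line-length limit is not the only hazard: unquoted substitution also splits file names on whitespace. A small demonstration, using a scratch directory and an invented file name with a space in it:

```shell
tmp=$(mktemp -d)
printf 'queryString\n' > "$tmp/has space.php"

# Word splitting breaks the name into ".../has" and "space.php", so grep fails:
grep -l queryString $(find "$tmp" -name '*.php') 2>/dev/null \
  || echo "split failed, as expected"

# find's own -exec ... {} + batches arguments like xargs, but passes names intact:
find "$tmp" -name '*.php' -exec grep -l queryString {} +

rm -rf "$tmp"
```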

Update

To answer your questions:

(1) Anything you can do in bash, you can do in sh. The one thing I've mentioned that would be any different is the use of $(command) in place of backticks around command, and that works in the version of sh on Macs. csh, zsh, ash, and fish are different.

(2) I think merely doing $ open $(dirname arg) will open a Finder window on the containing directory.
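To tie this back to the original goal: on macOS, open -R reveals a file in a Finder window instead of launching it. A sketch, assuming the queryString search from the question (the printf fallback is only there so the loop still does something on non-macOS machines where open is absent):

```shell
# Reveal every matching .php file in the Finder (open -R highlights the file)
find . -iname '*.php' -print0 \
  | xargs -0 grep -li 'queryString' \
  | while IFS= read -r f; do
      open -R "$f" 2>/dev/null || printf 'match: %s\n' "$f"
    done
```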

Charlie Martin
    `$()` is posix. Couple of issues with these. If you are piping to xargs, you should really be using `find ... -print0 | xargs -0` to safely process filenames with newlines. `grep pattern $(find ...)` is just a really bad syntax in general and shouldn't be suggested. Yes, you noted it may hit `ARG_MAX`, but it will also simply fail for any files containing whitespace. You could instead [loop over the results of find](http://stackoverflow.com/a/7039579/3076724), but there's really no need for pipes or a loop here, as I noted in my answer. – Reinstate Monica Please Oct 23 '14 at 22:46
  • And believe it or not, there are still shells that aren't POSIX. People who make filenames with newlines should be beaten severely. But you're right, after the beating you're still going to need the `-print0` trick. – Charlie Martin Oct 24 '14 at 16:34
  • @CharlieMartin It would be more of an issue with people purposefully trying to cause trouble, i.e. someone does `touch $'something\n/etc/passwd\nfoo.php'` and your `command` is `rm`. But agree it's not really a huge issue, and `command $(other command with potentially complicated output)` is far more likely to cause problems. – Reinstate Monica Please Oct 24 '14 at 21:12
  • @BroSlow, those people should be severely beaten *twice*. – Charlie Martin Nov 02 '14 at 18:36

It sounds like you want to open all *.php files that contain querystring from within a Terminal.app session.

You could do it this way:

find . -name '*.php' -exec grep -li 'querystring' {} \; | xargs open

With my setup, this opens MacVim with each file on a separate tab. YMMV.

johnsyweb

Replace -H with -l and you will get a list of those filenames that matched the pattern.
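The difference is easy to see side by side (scratch file, contents invented for the demo):

```shell
tmp=$(mktemp -d)
printf 'queryString=1\nqueryString=2\n' > "$tmp/a.php"

grep -iH queryString "$tmp/a.php"   # one "file:line" pair per matching line
grep -il queryString "$tmp/a.php"   # the file name, once, however many matches

rm -rf "$tmp"
```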

DigitalRoss

If you have bash 4, simply do

grep pattern /path/**/*.php

(you may need to enable recursive globbing first with shopt -s globstar). The ** operator is like

grep pattern `find /path -name '*.php' -print`
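Note that without globstar enabled, ** silently degrades to a single *. A quick check, using a scratch directory with invented contents:

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/deep/deeper"
printf 'queryString\n' > "$tmp/deep/deeper/x.php"

# shopt -s globstar is required for ** to recurse (bash >= 4)
bash -c 'shopt -s globstar; grep -l queryString "$1"/**/*.php' bash "$tmp"
# prints .../deep/deeper/x.php

rm -rf "$tmp"
```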
clt60
find /home/aaronmcdaid/Code/ -name '*.cpp' -exec grep -q -iH boost {} \; -exec echo {} \;

The first change I made is to add -q to your grep command. This is "Exit immediately with zero status if any match is found".

The good news is that this speeds grep up when a file has many matching lines, since you don't care how many matches there are. But it means we need another -exec on the end to actually print the file names when grep has been successful.

Aaron McDaid
  • @DigitalRoss has the more concise answer, equivalent in behaviour to mine. Vote for it instead! – Aaron McDaid May 21 '11 at 00:51
  • In GNU grep, the following arguments are equivalent: `-q, --quiet, --silent`. I think you meant `-l, --files-with-matches`, which will print the name of each input file from which output would normally have been printed. The scanning will stop on the first match. – johnsyweb May 21 '11 at 00:53
  • @Johnsyweb I did intend `-q`. I've just updated my answer to more fully explain my command line. As I already said, I defer to @DigitalRoss's answer, but I still feel mine is technically correct. I use `-q` to test for the match, and a second `-exec` to actually print the filenames. – Aaron McDaid May 21 '11 at 00:57

Pipe to another script:

 find . -iname "*.php" | myScript

File names will come into the stdin of myScript, one line at a time.

You can also use xargs to form/execute commands to act on each file:

 find . -iname "*.php" | xargs ls -l

Act on the files you find that match:

find . -iname "*.php" | xargs grep -l pattern | myScript

Act on the files that don't match the pattern:

find . -iname "*.php" | xargs grep -L pattern | myScript
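For completeness, myScript above can be anything that reads stdin; here is a minimal, hypothetical stand-in that just echoes what it receives, to show that the names arrive one per line:

```shell
# Hypothetical myScript: consume file names from stdin, one per line
myScript() {
  while IFS= read -r file; do
    printf 'processing %s\n' "$file"
  done
}

printf 'a.php\nb.php\n' | myScript
# processing a.php
# processing b.php
```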
Andrew

The grep result will be sent to stdout, so another -exec predicate is probably the best solution here.

Ignacio Vazquez-Abrams

In general, using multiple -execs with grep -q will be FAR faster than piping, since find implies a short-circuiting -a between each juxtaposed pair of expressions that isn't separated by an explicit operator. The main problem here is that you want something to happen when grep matches AND you want the matches printed. If the files are reasonably sized then this should be faster, because grep -q exits after finding a single match:

find . -iname "*php" -exec grep -iq queryString {} \; -exec grep -iH queryString {} \; -exec otherprogram {} \; 

If the files are particularly big, encapsulating it in a shell script may be faster than running multiple grep commands:

find . -iname "*php" -exec bash -c \
  'out=$(grep -iH queryString "$1"); [[ -n $out ]] && echo "$out" && exit 0 || exit 1' \
bash {} \; -print 

Also note, if the matches are not particularly needed, then

find . -iname "*php" -exec grep -iq queryString {} \; -exec otherprogram {} \;

will virtually always be faster than a piped solution like

find . -iname "*php" -print0 | xargs -0 grep -iH queryString | ...

Additionally, you should really use -type f in all cases, unless you want to catch directories named *php as well.
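The short-circuit behaviour is easy to verify with two throwaway files (names and contents invented):

```shell
tmp=$(mktemp -d)
printf 'queryString here\n' > "$tmp/yes.php"
printf 'no match\n'         > "$tmp/no.php"

# -print fires only when the preceding grep -iq succeeded (implicit -a)
find "$tmp" -type f -iname '*php' -exec grep -iq queryString {} \; -print
# prints only .../yes.php

rm -rf "$tmp"
```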

Reinstate Monica Please

Regarding the question of which is faster: if you actually care about the minuscule time difference (perhaps because you are trying to save your processor some work), try prefixing each candidate command with time and see which one performs better.
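For example, against a scratch directory (time reports on stderr, so stdout still shows the matches for comparison):

```shell
tmp=$(mktemp -d)
printf 'queryString\n' > "$tmp/a.php"

# real/user/sys times are printed to stderr after the command finishes
time find "$tmp" -iname '*php' -exec grep -iq queryString {} \; -print

rm -rf "$tmp"
```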

Rick
  • You are not answering the original question, for which another answer has already been accepted. If you are making a contribution to the accepted answer, please make it clearer. Otherwise your post should rather be a comment. – RaphaMex Dec 21 '17 at 03:47