
I'm grepping through a large pile of code managed by git, and whenever I do a grep, I see piles and piles of messages of the form:

> grep pattern * -R -n
whatever/.git/svn: No such file or directory

Is there any way I can make those lines go away?

anatoly techtonik
alexgolec
    These days I'd recommend using `ag`, `ack`, or `cgrep` instead - they're much faster/better than `grep` for searching code repositories. – lunixbochs Aug 03 '14 at 17:06
  • If you're grepping through code and looking to avoid particular directories, perhaps you should look at ack. It's a source-code aware grep, and as such will actively ignore such VCS directories (as well as vi and emacs backups, non-source files etc.). – Brian Agnew Oct 09 '15 at 09:07
  • How can a user get `No such file or directory` messages for files and/or directories that exist? Or, conversely, how can `grep *` be getting names of files that don't exist? Is this a race condition, where some other process manipulates the directory tree (creating, renaming and deleting files) while the `grep` is running? – Scott - Слава Україні Jun 02 '17 at 00:25

12 Answers


You can use the -s or --no-messages flag to suppress errors.

-s, --no-messages suppress error messages

grep pattern * -s -R -n
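A quick way to see the effect (a sketch; the path is deliberately nonexistent):

```shell
# Without -s, grep prints an error on stderr; with -s it is silent.
# Either way the exit status stays 2 ("trouble") on GNU grep.
grep -s pattern /path/that/does/not/exist
echo "exit status: $?"
```

Note that -s only hides the message; the failure is still reflected in the exit status, so scripts can still detect it.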
Dogbert
  • @Alex @Dogbert This does answer the question, but '-s' can mask problems, e.g. when you use xargs with grep. Try creating 2 files in a dir, 'aaa.txt' and 'a b.txt', both containing the string 'some text'. The command `/bin/ls -1 | xargs grep 'some text'` will give you "no such file or directory" because it breaks up 'a b.txt' into 2 args. If you suppress, you won't notice you missed a file. – Kelvin Jun 21 '11 at 21:26
  • @Kelvin What if I use `find` with `-print0` and `xargs -0` - does that solve the issue? Thanks – Luka Mar 12 '18 at 01:00
  • @Luka That should solve the issue. You won't run into problems if you always use those NUL options, but if you don't, it's almost guaranteed (IMHO) that you'll forget at the most inopportune time. – Kelvin Mar 12 '18 at 20:20
  • This works on Mac OS X where other options (`--quiet`) do not – philshem Feb 28 '19 at 17:10
  • @Luka if you use find you can omit xargs and use -exec from find (if you need shell facilities you can wrap your command with your favorite shell) – Et7f3XIV Dec 27 '20 at 03:11
  • @Et7f3XIV Since this comment till this day I developed a habit of always using null. I'm a bigger fan of xargs and null than exec under find. Greater flexibility. – Luka Dec 27 '20 at 20:09
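Kelvin's caveat above can be sidestepped by making both ends of the pipeline NUL-delimited; a minimal sketch (the file names are made up):

```shell
# Filenames with spaces survive when find emits NUL-separated names
# (-print0) and xargs splits on NUL (-0) instead of whitespace.
dir=$(mktemp -d) && cd "$dir"
echo 'some text' > 'a b.txt'
echo 'some text' > aaa.txt
find . -type f -print0 | xargs -0 grep -l 'some text'
```

Both files are reported, including the one with a space in its name that a plain `ls | xargs grep` would split into two bogus arguments.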

If you are grepping through a git repository, I'd recommend you use git grep. You don't need to pass in -R or the path.

git grep pattern

That will show all matches from your current directory down.
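A sketch of why this also avoids the errors in the question: git grep only opens tracked files, so nothing under `.git` (or any untracked clutter) is ever touched. Assumes git is installed; the file names are made up:

```shell
# git grep searches tracked files only - .git internals and
# untracked files are never opened, so no spurious errors.
git init -q demo && cd demo
echo 'needle' > tracked.txt && git add tracked.txt
echo 'needle' > untracked.txt
git grep -l needle        # lists tracked.txt, not untracked.txt
```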

Steve Prentice
  • +1 for the useful git-specific command. Won't work for svn though :-) – cadrian Jun 21 '11 at 13:58
  • +1 This is the git command I've been missing - this lets me grep for a string from the state of the tree in any commit (by adding the commit after "pattern"). – Kelvin Jun 21 '11 at 21:38
  • With the fugitive plugin, `Ggrep` also searches starting from the top of the Git directory instead of current directory. – Ciro Santilli OurBigBook.com Mar 10 '16 at 22:48
  • This appears to be significantly faster than standard grep. (Perhaps it ignores binary files, etc? No idea, but useful.) – Daniel Mar 24 '20 at 14:12

Errors like that are usually sent to the "standard error" stream, which you can redirect to a file or simply discard:

grep pattern * -R -n 2>/dev/null
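For instance (a sketch; the missing path is made up), matches on stdout still come through while the complaint on stderr is discarded:

```shell
# fd 2 (stderr) carries the error; fd 1 (stdout) carries matches,
# so redirecting fd 2 hides the complaint but keeps the results.
echo 'needle' > real.txt
grep -n needle real.txt /no/such/file 2>/dev/null
```

The trade-off, as with -s, is that genuine problems (unreadable files, bad arguments) are silenced too.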
lunixbochs

I have seen this happen several times with broken links (symlinks that point to files that do not exist): grep tries to search the target file, which does not exist (hence the correct and accurate error message).

I normally don't bother while doing sysadmin tasks over the console, but from within scripts I do look for text files with "find", and then grep each one:

find /etc -type f -exec grep -nHi -e "widehat" {} \;

Instead of:

grep -nRHi -e "widehat" /etc
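The difference is easy to reproduce with a deliberately dangling symlink (a sketch; the names are made up). `find -type f` never hands the link to grep, because a symlink has type `l`, not `f`:

```shell
# find -type f passes only regular files to grep, so a dangling
# symlink in the tree produces no error.
dir=$(mktemp -d) && cd "$dir"
ln -s /nowhere broken-link           # dangling symlink
echo 'needle' > real.txt
find . -type f -exec grep -nHi needle {} \;   # clean: only real.txt
```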
j0k
Isaac Uribe

I usually don't let grep do the recursion itself. There are usually a few directories you want to skip (.git, .svn...)

You can build clever aliases with stanzas like this one:

find . \( -name .svn -o -name .git \) -prune -o -type f -exec grep -Hn pattern {} \;

It may seem overkill at first glance, but when you need to filter out some patterns it is quite handy.
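If you have GNU grep (2.5.2 or later), the same pruning is also available without find, via `--exclude-dir` - a sketch:

```shell
# GNU grep can skip VCS directories during its own recursion:
grep -Rn --exclude-dir=.git --exclude-dir=.svn pattern .
```

The find variant above remains the more portable and more flexible choice, e.g. when combining several filters.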

cadrian

Have you tried the -0 option in xargs? Note that -0 expects NUL-delimited input, which `ls` does not produce - pair it with `find -print0` instead:

find . -type f -print0 | xargs -0 grep 'some text'
Kjuly
cokedude

Use -I in grep.

Example: grep SEARCH_ME -Irs ~/logs.
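What `-I` actually changes is how binary files are treated, not missing-file errors (as the comment below notes). A sketch with a made-up pair of files:

```shell
# -I (= --binary-files=without-match) treats binary files as
# non-matching, so only text files are reported.
dir=$(mktemp -d) && cd "$dir"
printf 'match\0bytes' > bin.dat      # NUL byte => binary to grep
echo 'match' > plain.txt
grep -rIl match .                    # reports only ./plain.txt
```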

Community
Bala
  • `-I` skips binary files - it's equivalent to `--binary-files=without-match`. It doesn't suppress "No such file or directory" messages though. – mwfearnley Aug 11 '16 at 08:07
  • If the errors are caused by files/dirs with spaces, the `-I` is a good solution. For example `find . -type f -name "*.txt" | xargs -I{} grep "search_str" "{}"` – pards Jun 07 '22 at 13:11

I redirect stderr to stdout and then use grep's invert-match (-v) to exclude the warning/error string that I want to hide:

grep -r <pattern> * 2>&1 | grep -v "No such file or directory"
talleyho

I was getting lots of these errors running "M-x rgrep" from Emacs on Windows with /Git/usr/bin in my PATH. Apparently in that case, M-x rgrep uses "NUL" (the Windows null device) rather than "/dev/null". I fixed the issue by adding this to .emacs:

;; Prevent issues with the Windows null device (NUL)
;; when using cygwin find with rgrep.
(defadvice grep-compute-defaults (around grep-compute-defaults-advice-null-device)
  "Use cygwin's /dev/null as the null-device."
  (let ((null-device "/dev/null"))
    ad-do-it))
(ad-activate 'grep-compute-defaults)
beaslera

One easy way to make grep return zero status all the time is to use || true

 → echo "Hello" | grep "This won't be found" || true

 → echo $?
   0

As you can see, the exit status here is 0 (success).

Nikhil JSK

Many answers, but none of them works, sigh:

 grep "nick-banner" *.html -R -n

It does not work:

grep: *.html: No such file or directory
Teemu

Problem:

This drove me bananas. I tried everything under the (Google) sun, and nothing worked: this grep just puked repeated errors about "sysctl: reading key ..." before finally printing the match:

sudo sysctl -a | grep vm.min_free_kbytes

Solution:

Nothing worked UNTIL I had an epiphany: What if I filtered in the front rather than at the back?... Yup: that worked:

sysctl -a --ignore 2>/dev/null | grep vm.min_free_kbytes

Conclusion:

Obviously not every command will have the --ignore switch, but it's an example of how I got around the problem filtering BEFORE my grep. Don't get so blinkered you chase your tail pursuing something that won't work ;-)

F1Linux