2118

How do I recursively grep all directories and subdirectories?

find . | xargs grep "texthere" *
Geoffrey Hale
  • 10,597
  • 5
  • 44
  • 45
wpiri
  • 21,337
  • 3
  • 17
  • 6
  • 141
    @TC1 The sad thing is that grep itself can answer the question (at least GNU grep): grep --help |grep recursive – Frank Schmitt Oct 25 '13 at 14:42
  • 12
If you find yourself frequently using grep to do recursive searches (especially if you manually do a lot of file/directory exclusions), you may find [ack](http://beyondgrep.com/) (a very programmer-friendly grep alternative) useful. – Nick McCurdy Oct 25 '13 at 20:56
  • 26
    Actually neither -r nor --recursive work on the Solaris box I use at work. And the man page for grep doesn't mention anything recursive. I had to resort to find and xargs myself. – Ben Jan 09 '14 at 15:59
  • 9
    ag is my favorite way to do this now https://github.com/ggreer/the_silver_searcher – dranxo May 21 '14 at 23:11
  • 2
    `grep -rin xlsx *.pl` doesn't work for me on Redhat Linux. I get a "no match" error. – Bulrush Sep 15 '15 at 18:43
  • 1
The * on the end is a mistake, isn't it? The shell will expand that to add all the files and directories in the current directory as parameters to the grep command, rather than allowing only xargs to provide the file names. – Neil Stevens Jul 15 '16 at 10:50
  • 1
If you're gonna mention Silver Searcher (ag) then you gotta mention RipGrep: https://news.ycombinator.com/item?id=12564442 – AAAfarmclub Oct 24 '17 at 22:01
  • 1
The final "*" is not needed; xargs feeds the filenames to the grep command in suitable chunks so as not to exceed the maximum allowed number of characters for a command line. – ennox May 29 '19 at 12:30
  • 1
    recursively in this context means that we search inside folders inside the folder we are standing in? – Brainmaniac Dec 13 '20 at 09:25

27 Answers

3064
grep -r "texthere" .

The first parameter represents the regular expression to search for, while the second one represents the directory that should be searched. In this case, . means the current directory.

Note: This works for GNU grep, and on some platforms like Solaris you must specifically use GNU grep as opposed to the legacy implementation. For Solaris this is the ggrep command.
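For example, if GNU grep is installed under that name, the call is the same as above, just with a different binary (a minimal sketch; the /usr/sfw/bin location is typical for Solaris 10 but may differ on your system):

/usr/sfw/bin/ggrep -r "texthere" .
# or, if ggrep is already on your PATH:
ggrep -r "texthere" .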

Greg Bacon
  • 134,834
  • 32
  • 188
  • 245
Vinko Vrsalovic
  • 330,807
  • 53
  • 334
  • 373
  • 46
    Note: "grep -r" only works on newer greps. It doesn't work on the grep that comes with `AIX 5.3` for example. – Withheld Feb 01 '13 at 13:09
  • 138
    Use grep -R to follow symlinks. – Eloff Apr 05 '13 at 23:01
  • 5
    On Cygwin, to also search for hidden files (that begin with a dot) and inside them, you need to use `find . -type f | xargs grep -l "search string"`. `grep -r` alone on Cygwin does not search hidden files. – pbies Jun 03 '14 at 11:23
  • 72
    It is good to know that "-i" would make it case insensitive, and "-n" also include the line number for each matched result. – Sadegh Jan 23 '15 at 12:02
  • 41
    also good to know, if you are just looking for a fixed string and not a regex, use -F option. it will save you scads of time by not invoking the regex parser. very handy if you are searching lots of files. – Jeff May 06 '15 at 17:20
  • 7
    alias rgrep='grep -r' – totten Mar 21 '16 at 16:38
  • 2
    This should not be the accepted answer, because the question has the tag "unix" and it does not work for a lot of Unixes, including HP-UX and AIX (it should work on Linuxes though). Instead, those OS need to use the find command like explained in https://unix.stackexchange.com/a/24917/141272 – Gabriel Hautclocq Aug 31 '17 at 15:40
  • 2
    What happens if the `.` was omitted? – CyberMew Feb 15 '18 at 04:13
  • 1
    You can use `-o` to only output the matching text. This is helpful if you have really long lines in the files you're searching (like minified code) and are mostly interested in the files, not the surrounding context. – Joshua Pinter Jun 18 '18 at 01:42
  • 1
    in .bash_aliases put rgrepi() { grep -i -r --include "${2:-*}" $1 ${3:-.} } – yoyoma2 Aug 14 '19 at 16:48
  • 2
    When doing this recursively, I often want to know the file where the contents are. You can get this by adding option `--with-filename`. – xmar Sep 23 '19 at 07:50
  • 1
    What if i want grep to keep polling a file so that when the specific text i am looking for is written to it, grep can find it? – AlphaGoku Feb 20 '20 at 09:26
  • are the quotations needed? perhaps it would be nice to add that. – Charlie Parker Aug 20 '21 at 20:04
  • So, typically we want something like `grep -RiFn3 "texthere" .` – Nathan majicvr.com Mar 20 '23 at 21:46
849

If you know the extension or pattern of the file you would like, another method is to use --include option:

grep -r --include "*.txt" texthere .

You can also mention files to exclude with --exclude.

Ag

If you frequently search through code, Ag (The Silver Searcher) is a much faster alternative to grep, that's customized for searching code. For instance, it's recursive by default and automatically ignores files and directories listed in .gitignore, so you don't have to keep passing the same cumbersome exclude options to grep or find.
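As a minimal sketch combining these options (the directory name node_modules and the backup pattern *~ are just illustrative), followed by the equivalent ag call:

# GNU grep: search only *.txt, skipping backup files and one directory
grep -r --include "*.txt" --exclude "*~" --exclude-dir "node_modules" "texthere" .

# ag: recursive and .gitignore-aware by default
ag "texthere"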

Dan Dascalescu
  • 143,271
  • 52
  • 317
  • 404
christangrant
  • 9,276
  • 4
  • 26
  • 27
  • 3
    Works great with grep that comes with Linux & Cygwin, but not with the one that comes with AIX. – Withheld Jan 31 '13 at 20:08
  • 1
    @KrzysztofWolny: ` ` instead of `=` works just fine on Ubuntu. PS: that's supposed to be a backticked space, but the SO markdown parser failed. – Dan Dascalescu Feb 19 '14 at 09:08
  • 7
    @DanDascalescu I upvoted for the `grep`, not for the Ag, just so you know :) – Bernhard May 15 '14 at 07:24
  • 1
    Do we have an option to exclude a directory while searching recursively? – Tom Taylor Sep 24 '17 at 15:47
  • Windows **cygwin** likes double-quotes `--include "*.txt" --include "*.TXT"` – Bob Stein Feb 19 '19 at 16:48
  • In .bash_aliases put the one-liner `rgrepi() { grep -i -r --include "${2:-*}" $1 ${3:-.} }` Then you can do: `rgrepi texthere ; rgrepi texthere '*.txt' ; rgrepi texthere '*.txt' ~/Downloads` ; to respectively recursively search everything ; recursively search a file pattern ; recursively search a file pattern in a specific directory – yoyoma2 Aug 14 '19 at 17:07
  • There is an `--exclude-dir` option for excluding directories when searching recursively – Jonathan Holvey Jan 22 '20 at 06:51
161

I now always use (even on Windows with GoW -- Gnu on Windows):

grep --include="*.xxx" -nRHI "my Text to grep" *

(As noted by kronen in the comments, you can add 2>/dev/null to avoid "permission denied" output)
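For instance, the same command with the redirect appended (a minimal sketch):

grep --include="*.xxx" -nRHI "my Text to grep" * 2>/dev/null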

That includes the following options:

--include=PATTERN

Recurse in directories, only searching files matching PATTERN.

-n, --line-number

Prefix each line of output with the line number within its input file.

(Note: phuclv adds in the comments that -n decreases performance a lot, so you might want to skip that option)

-R, -r, --recursive

Read all files under each directory, recursively; this is equivalent to the -d recurse option.

-H, --with-filename

Print the filename for each match.

-I     

Process a binary file as if it did not contain matching data;
this is equivalent to the --binary-files=without-match option.

And I can add 'i' (-nRHIi), if I want case-insensitive results.

I can get:

/home/vonc/gitpoc/passenger/gitlist/github #grep --include="*.php" -nRHI "hidden" *
src/GitList/Application.php:43:            'git.hidden'      => $config->get('git', 'hidden') ? $config->get('git', 'hidden') : array(),
src/GitList/Provider/GitServiceProvider.php:21:            $options['hidden'] = $app['git.hidden'];
tests/InterfaceTest.php:32:        $options['hidden'] = array(self::$tmpdir . '/hiddenrepo');
vendor/klaussilveira/gitter/lib/Gitter/Client.php:20:    protected $hidden;
vendor/klaussilveira/gitter/lib/Gitter/Client.php:170:     * Get hidden repository list
vendor/klaussilveira/gitter/lib/Gitter/Client.php:176:        return $this->hidden;
...
VonC
  • 1,262,500
  • 529
  • 4,410
  • 5,250
  • Gow looks promising - newer than the GNU Windows utilities that I have been using. Trying it now... – Radim Cernej Jan 23 '16 at 00:16
  • 2
    what is the meaning of the last character * here? – lorniper Aug 04 '16 at 07:07
  • 2
    @lorniper it makes the shell select all files and folders in your current directory, making in turn the grep apply to those files and (recursively because of the `-R` option) to the folders. – VonC Aug 04 '16 at 07:10
  • 2
@lorniper Not exactly: `*` or `.` is a glob pattern (interpreted by the shell): http://unix.stackexchange.com/a/64695/7490. '`.`' will select dotfiles or dot folders as well (like `.git/`) – VonC Aug 04 '16 at 07:22
  • previously I've always used `grep -rnI` but then I learned that [`-n` decreases performance a lot](https://stackoverflow.com/a/12630617/995714) so I just use it when really needed and normally I'll use `-rI` – phuclv Feb 03 '19 at 03:33
  • @phuclv Thank you. I have included your comment in the answer for more visibility. – VonC Feb 03 '19 at 03:45
I would add `2>/dev/null` to avoid "permission denied" output – Kronen Mar 16 '21 at 14:08
  • @Kronen Thank you, good point. I have included your comment in the answer for more visibility. – VonC Mar 16 '21 at 15:28
140

Also:

find ./ -type f -print0 | xargs -0 grep "foo"

but grep -r is a better answer.

Iulian Onofrei
  • 9,188
  • 10
  • 67
  • 113
  • 14
    Or if you don't want to worry about spaces in filenames `find . -type f -exec grep "foo" '{}' \;` works well where supported. – Edd Steel Dec 31 '11 at 19:42
  • 4
    If you are going to pipe find through xargs to grep, AND if you are only searching for a fixed string (i.e., not a regex), you might benefit from invoking the grep -F option, so grep won't load the regex engine for each invocation. If there are a lot of files it will be much faster. – Jeff Apr 19 '13 at 16:58
  • 2
    find . -type f -exec grep -Hu "foo" {} \; is what I use as it gives the filename. – Wes Aug 27 '13 at 08:48
  • This works on all *nix because it is [POSIX 7](http://pubs.opengroup.org/onlinepubs/9699919799/utilities/contents.html) – Ciro Santilli OurBigBook.com Feb 16 '14 at 13:31
  • 2
    `find ./ -type f -print0 | xargs -0 grep "foo"` – aehlke Jul 02 '14 at 16:49
this was actually a better solution for me because grep -r took forever. This might be because grep -r was searching hidden files (of which I had a ton) and maybe find was skipping them. find also showed the file name next to each found line, which was important to me – yosefrow Feb 01 '17 at 10:33
  • @EddSteel Is there a particular reason you have quoted the placeholder like so: `'{}'`? On GNU and FreeBSD find it some seems to handle spaces just fine either way. – Harold Fischer Sep 11 '18 at 02:39
  • @HaroldFischer just reflex/paranoia :) – Edd Steel Oct 19 '18 at 04:33
33

globbing **

Using grep -r works, but it may be overkill, especially in large folders.

For more targeted usage, here is the syntax that uses globbing (**):

grep "texthere" **/*.txt

which greps only the specific files matching the selected pattern. It works in shells that support it, such as Bash 4+ or zsh.

To activate this feature, run: shopt -s globstar.
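Putting the two steps together (a minimal sketch for Bash 4+; zsh understands ** without any option):

shopt -s globstar              # Bash 4+ only, off by default
grep "texthere" **/*.txt       # now matches *.txt in all subdirectories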

See also: How do I find all files containing specific text on Linux?

git grep

For projects under Git version control, use:

git grep "pattern"

which is much quicker.

ripgrep

For larger projects, the quickest grepping tool is ripgrep which greps files recursively by default:

rg "pattern" .

It's built on top of Rust's regex engine which uses finite automata, SIMD and aggressive literal optimizations to make searching very fast. Check the detailed analysis here.

kenorb
  • 155,785
  • 88
  • 678
  • 743
27

On POSIX systems, you won't find an -r parameter for grep, so your grep -rn "stuff" . won't run, but if you use the find command, it will:

find . -type f -exec grep -n "stuff" {} \; -print

This works on Solaris and HP-UX.
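A sketch of a still-POSIX variant: terminating -exec with + batches many files per grep invocation, and passing /dev/null as an extra argument forces grep to print filenames even when a batch contains only a single file:

find . -type f -exec grep -n "stuff" /dev/null {} +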

rook
  • 5,880
  • 4
  • 39
  • 51
  • 1
    what is the meaning of {} \; -print respectively? – user1169587 Apr 27 '16 at 06:43
  • 4
In the `-exec` option, the symbol `{}` is a reference to the filename currently found by the `find` tool (so that we can do something with the filename we found); the `-exec` option should be terminated with the `;` symbol (to mark the end of the exec command), but because this is all running in a shell, that symbol should be escaped. Finally, the `-print` option lets the `find` tool print out the found filenames on the screen. – rook Apr 27 '16 at 09:47
13

If you only want to follow actual directories, and not symbolic links,

grep -r "thingToBeFound" directory

If you want to follow symbolic links as well as actual directories (be careful of infinite recursion),

grep -R "thing to be found" directory

Since you're trying to grep recursively, the following options may also be useful to you:

-H: outputs the filename with the line

-n: outputs the line number in the file

So if you want to find all files containing Darth Vader in the current directory or any subdirectories and capture the filename and line number, but do not want the recursion to follow symbolic links, the command would be

grep -rnH "Darth Vader" .

If you want to find all mentions of the word cat in the directory

/home/adam/Desktop/TomAndJerry 

and you're currently in the directory

/home/adam/Desktop/WorldDominationPlot

and you want to capture the filename but not the line number of any instance of the string "cats", and you want the recursion to follow symbolic links if it finds them, you could run either of the following

grep -RH "cats" ../TomAndJerry                   #relative directory

grep -RH "cats" /home/adam/Desktop/TomAndJerry   #absolute directory

Source:

running "grep --help"

A short introduction to symbolic links, for anyone reading this answer and confused by my reference to them: https://www.nixtutor.com/freebsd/understanding-symbolic-links/

SarcasticSully
  • 442
  • 4
  • 14
11

Just the filenames can be useful too:

grep -r -l "foo" .
chim
  • 8,407
  • 3
  • 52
  • 60
11

To recursively find the names (with paths) of the files containing a particular string, use the command below on UNIX:

find . | xargs grep "searched-string"

For Linux:

grep -r "searched-string" .

To find a file by name on a UNIX server:

find . -type f -name file_name

To find a file by name on a Linux server:

find . -name file_name
Girdhar Singh Rathore
  • 5,030
  • 7
  • 49
  • 67
11

Another syntax to grep a string recursively in all files on a Linux system:

grep -irn "string"

-r indicates a recursive search: it looks for the specified string in the given directory and its subdirectories, inside files, programs, etc.

-i ignores case, so the search matches the string regardless of capitalization

-n prints the line number of the specified string

NB: this can print a massive amount of output to the console, so you might need to filter it by piping and removing the less interesting bits of info. It also searches binary files, so you may want to filter out some of those results.
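For example, a sketch that skips binary files (-I, GNU grep) and pages the output instead of flooding the console:

grep -irnI "string" . | less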

geek
  • 307
  • 2
  • 10
9

ag is my favorite way to do this now github.com/ggreer/the_silver_searcher . It's basically the same thing as ack but with a few more optimizations.

Here's a short benchmark. I clear the cache before each test (cf https://askubuntu.com/questions/155768/how-do-i-clean-or-disable-the-memory-cache )

ryan@3G08$ sync && echo 3 | sudo tee /proc/sys/vm/drop_caches
3
ryan@3G08$ time grep -r "hey ya" .

real    0m9.458s
user    0m0.368s
sys 0m3.788s
ryan@3G08:$ sync && echo 3 | sudo tee /proc/sys/vm/drop_caches
3
ryan@3G08$ time ack-grep "hey ya" .

real    0m6.296s
user    0m0.716s
sys 0m1.056s
ryan@3G08$ sync && echo 3 | sudo tee /proc/sys/vm/drop_caches
3
ryan@3G08$ time ag "hey ya" .

real    0m5.641s
user    0m0.356s
sys 0m3.444s
ryan@3G08$ time ag "hey ya" . #test without first clearing cache

real    0m0.154s
user    0m0.224s
sys 0m0.172s
Community
  • 1
  • 1
dranxo
  • 3,348
  • 4
  • 35
  • 48
6

This should work:

grep -R "texthere" *
sashkello
  • 17,306
  • 24
  • 81
  • 109
sumit kumar
  • 602
  • 3
  • 11
  • 26
6

If you are looking for specific content in all files in a directory structure, you may use find, since it makes what you are doing clearer:

find -type f -exec grep -l "texthere" {} +

Note that -l (lowercase L) shows the name of the file that contains the text. Remove it if you instead want to print the match itself. Or use -H to get the filename together with the match. Putting it together, other alternatives are:

find -type f -exec grep -Hn "texthere" {} +

Where -n prints the line number.

fedorqui
  • 275,237
  • 103
  • 548
  • 598
  • 2
    Up-voted for being the only `find` solution to both avoid unnecessary use of `xargs` and use `+` instead of `\;` with `-exec`, thereby avoiding tons of unnecessary process launches. :-) – ShadowRanger Jan 30 '16 at 08:08
6

This is the one that worked for my case on my current machine (git bash on windows 7):

find ./ -type f -iname "*.cs" -print0 | xargs -0 grep "content pattern"

I always forget the -print0 and -0 for paths with spaces.

EDIT: My preferred tool is now instead ripgrep: https://github.com/BurntSushi/ripgrep/releases . It's really fast and has better defaults (like recursive by default). Same example as my original answer but using ripgrep: rg -g "*.cs" "content pattern"

arkod
  • 1,973
  • 1
  • 20
  • 20
5

grep -r "texthere" . (notice period at the end)

(^credit: https://stackoverflow.com/a/1987928/1438029)


Clarification:

grep -r "texthere" / (recursively grep all directories and subdirectories)

grep -r "texthere" . (recursively grep these directories and subdirectories)

grep recursive

grep [options] PATTERN [FILE...]

[options]

-R, -r, --recursive

Read all files under each directory, recursively.

This is equivalent to the -d recurse or --directories=recurse option.

http://linuxcommand.org/man_pages/grep1.html

grep help

$ grep --help

$ grep --help |grep recursive
  -r, --recursive           like --directories=recurse
  -R, --dereference-recursive

Alternatives

ack (http://beyondgrep.com/)

ag (http://github.com/ggreer/the_silver_searcher)

Community
  • 1
  • 1
Geoffrey Hale
  • 10,597
  • 5
  • 44
  • 45
5

Throwing my two cents here. As others have already mentioned, grep -r doesn't work on every platform. This may sound silly, but I always use git.

git grep "texthere"

Even if the directory is not staged, I just stage it and use git grep.
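If you would rather not create a repository or stage anything, git grep can also search an ordinary directory with --no-index (a sketch; available in reasonably recent Git versions):

git grep --no-index "texthere"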

Zstack
  • 4,046
  • 1
  • 19
  • 22
4

Below are the commands to search for a string recursively in Unix and Linux environments.

For UNIX, the command is:

find . -type f -exec grep "string to be searched" "{}" \;

For Linux, the command is:

grep -r "string to be searched" .
Girdhar Singh Rathore
  • 5,030
  • 7
  • 49
  • 67
  • The use of `find` with `-exec` needs more upvotes over the numerous answers with the non-portable and (slightly) less efficient `-print0 | xargs -0` – tripleee Oct 26 '22 at 11:15
4

In 2018, you want to use ripgrep or the-silver-searcher because they are way faster than the alternatives.

Here is a directory with 336 first-level subdirectories:

% find . -maxdepth 1 -type d | wc -l
     336

% time rg -w aggs -g '*.py'
...
rg -w aggs -g '*.py'  1.24s user 2.23s system 283% cpu 1.222 total

% time ag -w aggs -G '.*py$'
...
ag -w aggs -G '.*py$'  2.71s user 1.55s system 116% cpu 3.651 total

% time find ./ -type f -name '*.py' | xargs grep -w aggs
...
find ./ -type f -name '*.py'  1.34s user 5.68s system 32% cpu 21.329 total
xargs grep -w aggs  6.65s user 0.49s system 32% cpu 22.164 total

On OSX, this installs ripgrep: brew install ripgrep. This installs silver-searcher: brew install the_silver_searcher.

hughdbrown
  • 47,733
  • 20
  • 85
  • 108
  • 2
    Speed is important if you need to do this often, but most of us find ourselves only doing this a few times a year at most. Installing the latest spiffy third-party juju tool du jour is overkill and the solutions which haven't changed much since 1978 are good to know regardless. – tripleee Jul 06 '18 at 15:32
  • I find it highly implausible that a programmer would search for text in a source tree only several times per year. But even from the viewpoint of usability, `rg` has a considerable edge over cobbling together a recursive grep command from scratch. Using `rg`: `rg foo`. Using unix tools: `find . | xargs grep foo`. And if any of your files has a quote in it, you need to use `find . -print0 | xargs -0 grep foo`. Are you going to remember that if you use this a few times a year? – hughdbrown Oct 06 '18 at 02:07
  • 1
    You're forgetting `find . -type f -exec grep 'regex' {} +` which indeed is easy to remember if you use these tools with any regularity. But probably you should run `ctags` or `etags` on your source tree anyway if you need to find stuff frequently. – tripleee Oct 06 '18 at 13:44
  • I've been using ripgrep and it's great. But silver searcher is fantastic for programmers. +1 – hookenz Jul 01 '19 at 22:48
3

On my IBM AIX server (OS version: AIX 5.2), I use:

find ./ -type f -print -exec grep -n -i "stringYouWannaFind" {} \; 

This will print out the path/file name and the line number within the file, like:

./inc/xxxx_x.h

2865: /** Description : stringYouWannaFind */

Anyway, it works for me :)

user3606336
  • 281
  • 2
  • 4
2

For a list of available flags:

grep --help 

Returns all matches for the regexp texthere in the current directory, with the corresponding line number:

grep -rn "texthere" .

Returns all matches for texthere, starting at the root directory, with the corresponding line number and ignoring case:

grep -rni "texthere" /

flags used here:

  • -r recursive
  • -n print line number with output
  • -i ignore case
JSON C11
  • 11,272
  • 7
  • 78
  • 65
1

Note that find . -type f | xargs grep whatever sorts of solutions will run into "Argument list too long" errors when there are too many files matched by find.

The best bet is grep -r but if that isn't available, use find . -type f -exec grep -H whatever {} \; instead.

m.thome
  • 222
  • 1
  • 6
  • Huh? `xargs` is specifically a workaround for the "Argument list too long" problem. – tripleee Apr 21 '15 at 06:12
  • 2
    Well, no - xargs is _specifically_ for converting a pipe of arguments to an arglist, but yes, it is true that modern xargs _when used with -s and/or -L_ can deal with very long arglists by breaking into multiple command invocations, but it isn't configured that way by default (and wasn't in any of the above responses). As an example: `find . -type f | xargs -L 100 grep whatever` – m.thome Apr 23 '15 at 13:56
  • Which platform would that be on? [POSIX `xargs`](http://pubs.opengroup.org/onlinepubs/009604599/utilities/xargs.html) is standardized to have this behavior out of the box. *"The `xargs` utility shall limit the command line length such that when the command line is invoked, the combined argument and environment lists ... shall not exceed {ARG_MAX}-2048 bytes."* – tripleee Apr 23 '15 at 15:42
  • Hm. While the gnu docs are less clear than posix on this basis, and I no longer have access to the machine that caused me to make this statement, I cannot confirm my original interpretation on any current implementation. Recursive grep is, of course, still preferable if available, but there's little reason to avoid the xargs recipe (do use -H for the grep to avoid the final invocation of grep getting passed only a single filename, though). – m.thome Apr 24 '15 at 17:05
1

I guess this is what you're trying to write:

grep myText $(find .)

And this may also be helpful if you want to find the files grep hit:

grep myText $(find .) | cut -d : -f 1 | sort | uniq
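A whitespace-safe sketch of the same idea, assuming your find and xargs support -print0/-0 (GNU and BSD versions do):

find . -type f -print0 | xargs -0 grep myText
find . -type f -print0 | xargs -0 grep -l myText    # just the matching filenames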
  • It's very intuitive: for example: grep -i acc $(find . -name "execution*.*") – Yu Shen Nov 16 '16 at 10:37
  • This runs into several common beginner problems for file names with whitespace in them etc. See https://mywiki.wooledge.org/BashFAQ/020 and https://www.iki.fi/era/unix/award.html#backticks – tripleee Oct 26 '22 at 11:13
1

Here's a recursive shell script (tested lightly with bash and sh) that traverses all subfolders of a given folder ($1) and uses grep to search for a given string ($3) in files with a given name ($2):

$ cat script.sh
#!/bin/sh

cd "$1"

loop () {
    for i in *
    do
        if [ -d "$i" ]
        then
            # echo entering "$i"
            cd "$i"
            loop "$1" "$2"
        fi
    done

    if [ -f "$1" ]
    then
        grep -l "$2" "$PWD/$1"
    fi

    cd ..
}

loop "$2" "$3"

Running it and an example output:

$ sh script.sh start_folder filename search_string
/home/james/start_folder/dir2/filename
James Brown
  • 36,089
  • 7
  • 43
  • 59
1

For .gz files, to recursively scan all files and directories (change the file type, or put * to match everything):

find . -name \*.gz -print0 | xargs -0 zgrep "STRING"
0

Just for fun, a quick and dirty search of *.txt files if the @christangrant answer is too much to type :-)

grep -r texthere .|grep .txt

PJ Brunet
  • 3,615
  • 40
  • 37
0

Get the files matched by the first grep command, and find which of them do (or, with -L, do not) contain a second word; the input files for the second grep come from the result files of the first grep command.

grep -l -r --include "*.js" "FIRSTWORD" * | xargs grep "SECONDwORD"
grep -l -r --include "*.js" "FIRSTWORD" * | xargs grep -L "SECONDwORD"

grep -l -r --include "*.js" "SEARCHWORD" * | awk -F'/' '{print $NF}' | xargs -I{} sh -c 'echo {}; grep -l -r --include "*.html" -w --include=*.js -e {} *;  echo '''

grep "SEARCH_STRING" *.log | grep -e "http" -e "https" | awk '{print $NF}' | uniq

Here's how you can modify the command to extract the value of messageName:

grep -m 2 "In sendMessage:: " *LOGFILE.log | grep -o -e "messageName=[^,]*" | cut -d= -f2 | sort | uniq | tee >(echo "Number of unique values: $(wc -l)")

grep "In Message:: " *messaging.log | grep -o -e "messageName=[^,]*" | cut -d= -f2 | sort | uniq | while read -r messageName; do grep -m 1 "In  sendMessage:: .*messageName=${messageName}" *logfile.log | head -n 1; done

To run the grep commands below over the files above, sorted in descending order by their update time and excluding files in .gz format:

grep "org.springframework.batch.item.ItemStreamException: Failed to initialize the reader at" $(ls -lrth | grep -i opti | awk '{print $NF}')
      grep -A 15 "request to URL : SEARCH" $(ls -lth | grep "common" | grep -v ".gz"  | awk '{print $NF}')

Command to create a new file containing everything from the first occurrence of a pattern to the last occurrence:

sed -n '/14 Jan 2023/,/14 Jan 2023/p' common.log > common_1day.log

For files modified today:

ls -lrth $(find . -type f -name "*.log" -newermt "$(date -R -d 'today 00:00')" -print)
grep "CID" $(find . -type f -name "*.log" -newermt "$(date -R -d 'today 00:00')" -print)
zgrep "SEARCH" $(find . -type f -newermt "$(date -R -d 'today 00:00')" -print)
ls -lrth $(find . -type f -name "*" -newermt "$(date -R -d 'today 00:00')" -print)
less +G $(find . -type f -name "*LOG_FILE.log" -newermt "$(date -R -d 'today 00:00')" -print)
grep Async $(find . -type f -name "*" -newermt "2023-04-14 00:00:00" ! -newermt "2023-04-16 00:00:00" -print)

Find commands

find . -type f -not -path "*/target/*" -name "log4j2.xml" -exec grep -H '<Async name="' {} \;
0

On Solaris (and likely other old Unixes)

ggrep -r "$yourtext" $directory

access_granted
  • 1,807
  • 20
  • 25