
I'm trying to do a quick bulk search and replace for some common strings across a large number of source files. I'm doing this on a Mac, so I know it sometimes behaves a little differently. To start, I'm trying to get the list of all files recursively that contain the string I want to change:

grep -rlF 'oldtext' .

When the command runs, it appears to wait for stdin after displaying a list of files that contain the string I'm looking for. Control doesn't return to a prompt, and I can type and press enter with no real reaction from the terminal aside from the terminal showing what I type as I type it.

Am I calling grep incorrectly, or is there something about it running on a Mac that I'm not accounting for?

chaosTechnician
    See if answers in this post help - https://stackoverflow.com/q/9704020/5291015 – Inian Jun 10 '20 at 18:04
  • Use `grep` to **g/re/p** within files and use `find` to **find** files. Don't use `grep` to **find** files - there is a big clue in the names of the tools what they are best used for. You're also playing with fire using `grep` to find `oldtext` as a string (`-F`) but then using `sed` to replace `oldtext` as a regexp - that can easily produce unexpected/undesirable results. If you [edit] your question to provide concise, testable sample input and expected output then we can help you do whatever it is you're trying to do the right way. – Ed Morton Jun 10 '20 at 18:05
  • or try using an explicit placeholder for `xargs` like `xargs -I {} sed -i '' 's/oldtext/newtext/g' {}` – Inian Jun 10 '20 at 18:05
    How do you know it's polling on stdin? When you type `oldtext` does it echo that back? – oguz ismail Jun 10 '20 at 18:06
  • I simplified and removed some of the wider context since it was pulling in discussion that wasn't quite related to the issue I was trying to ask about. Though, it seems the actual issue was that I wasn't being patient enough while the command was running across the large number of files. – chaosTechnician Jun 10 '20 at 23:16
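Putting the comments together, a minimal sketch of the intended bulk replace on macOS might look like this (a hedged sketch, not a drop-in fix: BSD sed's `-i` takes a mandatory backup-suffix argument, so you pass an empty `''`; `--null` and `-0` keep file names with spaces intact):

```shell
# List files containing the literal string, then edit each in place.
# BSD sed on macOS needs -i '' (empty backup suffix); GNU sed uses plain -i.
grep -rlF --null 'oldtext' . | xargs -0 sed -i '' 's/oldtext/newtext/g'
```

Per Ed Morton's comment above, this searches for `oldtext` literally but replaces it as a regexp, so it is only safe when the string contains no regex metacharacters.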

2 Answers


I'm guessing you want to replace a string in a file but not update the timestamp, etc. of files that don't contain that string. I'd do that as something like this, assuming neither text can contain backslashes:

tmp=$(mktemp) || exit
while IFS= read -r file; do
    awk -v old='oldtext' -v new='newtext' '
        s=index($0,old) { $0=substr($0,1,s-1) new substr($0,s+length(old)); f=1 }
        { print }
        END { exit !f }
    ' "$file" > "$tmp" &&
    mv "$tmp" "$file"
done < <(find . -type f)

That will work with POSIX tools as long as your file names don't contain newlines.

If your old or new text can contain a backslash it'd require a small tweak to set the awk variables from ENVIRON[] or ARGV[] instead of with -v (see How do I use shell variables in an awk script?).
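A hedged sketch of that tweak, passing the strings in through ARGV[] so awk does not expand backslash escapes the way -v does (the strings and file name here are placeholders):

```shell
# Read old/new from ARGV[], then blank them so awk doesn't open them as files.
awk '
    BEGIN { old=ARGV[1]; new=ARGV[2]; ARGV[1]=ARGV[2]="" }
    s=index($0,old) { $0=substr($0,1,s-1) new substr($0,s+length(old)); f=1 }
    { print }
    END { exit !f }
' 'old\text' 'new\text' file
```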

The above is doing string operations, which sed doesn't support. See Is it possible to escape regex metacharacters reliably with sed for what you'd have to do to get sed to act as if it were using strings.
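For reference, a hedged sketch of that escaping (the classic approach from the linked question; it breaks on newlines, and backslashes in the replacement need extra care):

```shell
old='a.b*c' new='x&y'    # sample strings containing metacharacters
# Escape the search as a literal BRE: wrap each char in [...], handle ^ specially.
esc_old=$(printf '%s\n' "$old" | sed 's/[^^]/[&]/g; s/\^/\\^/g')
# Escape the / delimiter and & in the replacement text.
esc_new=$(printf '%s\n' "$new" | sed 's/[\/&]/\\&/g')
printf '%s\n' 'foo a.b*c bar' | sed "s/$esc_old/$esc_new/g"
```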

Ed Morton

After reading oguz ismail's comment, I tried a few other things and realized that the command was not waiting for stdin. It was still running, just without finding any further matches for nearly a minute. After that delay, more files were listed as they were found, and after a few similar pauses the command completed as I had originally expected.

There were just so many source files that I mistook the delay for a wait on stdin, since the macOS terminal lets you keep typing while a command is running.

chaosTechnician
  • Check out [ripgrep](https://github.com/BurntSushi/ripgrep) if you wish to speed up search, especially if you have multiple cores and you wish to avoid searching files in gitignore, etc – Sundeep Jun 11 '20 at 04:46