299

The following command correctly changes the contents of two files:

sed -i 's/abc/xyz/g' xaa1 xab1 

But what I need is to change several such files dynamically, without knowing their names in advance. I want to write a command that reads all the files in the current directory whose names start with xa and has sed change their contents.

shantanuo
  • You mean `sed -i 's/abc/xyz/g' xa*`? – Paul R May 04 '12 at 09:13
  • The answers here don't suffice. See https://unix.stackexchange.com/questions/112023/how-can-i-replace-a-string-in-a-files?newreg=63655f028ac04f11bbf861c7bba8db9f – Isaac Jan 16 '18 at 03:47
  • Here's another answer on updating many files at once: https://unix.stackexchange.com/questions/29268/how-to-search-and-replace-text-in-all-php-files-in-a-directory-and-its-subdirec – ᴍᴇʜᴏᴠ Dec 17 '20 at 11:50
  • I tried @PaulR's solution and it worked, but what I don't understand is all these complicated other answers! What is missing from your solution? – Soheil Rahsaz Oct 11 '21 at 10:06
  • @SoheilRahsaz Sometimes when there are too many files, the shell will complain about the argument list length, as demonstrated by the top answer. – Abdessabour Mtk Jun 11 '22 at 20:15
  • `sed` can take multiple files. All these complicated answers are just about the command-line length limit. – Константин Ван Mar 19 '23 at 12:04

11 Answers

260

I'm surprised nobody has mentioned the -exec argument to find, which is intended for exactly this type of use case, although it will start a process for each matching file name:

find . -type f -name 'xa*' -exec sed -i 's/asd/dsg/g' {} \;

Alternatively, one could use xargs, which will invoke fewer processes:

find . -type f -name 'xa*' | xargs sed -i 's/asd/dsg/g'

Or, more simply, use the + variant of -exec instead of \; so that find passes more than one file per sed invocation:

find . -type f -name 'xa*' -exec sed -i 's/asd/dsg/g' {} +
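
If any of the matching file names could contain spaces or newlines, the plain pipe into xargs shown above can split them incorrectly. A minimal NUL-delimited sketch, assuming GNU find and xargs:

find . -type f -name 'xa*' -print0 | xargs -0 sed -i 's/asd/dsg/g'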
ealfonso
  • I had to modify the command in this answer like so: `find ./ -type f -name 'xa*' -exec sed -i '' 's/asd/dsg/g' {} \;` that's the location for the find command (`./`) and a pair of single quotes after `-i` for OSX. – shelbydz Mar 17 '17 at 13:30
  • The find command works as supplied by ealfonso: `./` is equivalent to `.`, and on OS X the `-i` option simply takes a backup-suffix parameter. – uhausbrand Oct 05 '18 at 08:06
  • The `-exec` option of find along with `{} +` is sufficient to solve the problem as stated, and should be fine for most requirements. But `xargs` is a better choice in general because it also allows parallel processing with the `-P` option. When your glob expansion is large enough to overflow your command line length, you are likely to also benefit from a speedup over a sequential run (see the sketch after these comments). – Amit Naidu Jun 15 '20 at 14:02
  • I don't know why none of these options work for me as I have used them successfully on *nix machines. I'm in terminal on a Mac though now... – skittlebiz Jan 18 '23 at 22:50
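
To make Amit Naidu's parallelism point concrete, here is a sketch assuming GNU xargs, where -P 4 runs up to four sed processes at once:

find . -type f -name 'xa*' -print0 | xargs -0 -P 4 sed -i 's/asd/dsg/g'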
167

Better yet:

for i in xa*; do
    sed -i 's/asd/dfg/g' "$i"
done

because nobody knows how many files there are, and it's easy to break command-line limits.

Here's what happens when there are too many files:

# grep -c aaa *
-bash: /bin/grep: Argument list too long
# for i in *; do grep -c aaa $i; done
0
... (output skipped)
#
lenik
  • If there are that many files, you'll break the command line limit in the `for` command. To protect yourself from that, you'd have to use `find ... | xargs ...` – glenn jackman May 04 '12 at 15:58
  • @glenn jackman: I've just tried `for i in $(find .); do echo $i; done` with 600k files, and it works. Could you please elaborate: what is the limit for the `for` command, and how can I break it? – lenik May 04 '12 at 19:41
  • maybe I'm wrong. What happens if you `echo [pattern_that_expands_to_600k_files]`? – glenn jackman May 04 '12 at 19:46
  • the whole point of using `for` is to be able to **iterate** through the list of files and never have to deal with 600k files at the same time, `$i` is always a single file name. please, correct me if i'm wrong, but as far as i know, there's no limit imposed on the `for` statement. – lenik May 04 '12 at 21:55
  • I don't know the implementation, but the "xa*" pattern does have to get expanded at some point. Does the shell do the expansion differently for `for` than it does for `echo` or `grep`? – glenn jackman May 04 '12 at 22:01
  • of course it does the expansion differently, that's why i said it's better to use `for` statement to avoid breaking command line length limitations in case of having too many files. – lenik May 04 '12 at 22:05
  • See the updated answer. If you need more info, please ask an official question, so people can help you. – lenik May 05 '12 at 01:18
  • In the sed command, you need to use `"$i"` instead of `$i` to avoid word splitting on filenames with spaces. Otherwise this is very nice. – Wildcard Nov 25 '15 at 23:28
  • Regarding the list, I believe the difference is that `for` is part of the language syntax, not even just a builtin. For `sed -i 's/old/new' *`, the expansion of `*` must ALL be passed as an arglist to sed, and I'm fairly sure this has to happen before the `sed` process can even be started. Using the `for` loop, the full arglist (the expansion of `*`) never gets passed as a command, only stored in the shell memory and iterated through. I don't have any reference for this at all though, it just seems probable that is the difference. (I'd love to hear from someone more knowledgeable...) – Wildcard Nov 26 '15 at 01:18
92

You could use grep and sed together. This allows you to search subdirectories recursively.

Linux: grep -r -l <old> * | xargs sed -i 's/<old>/<new>/g'
OS X: grep -r -l <old> * | xargs sed -i '' 's/<old>/<new>/g'

For grep:
    -r recursively searches subdirectories
    -l prints the names of files that contain matches
For sed:
    -i edits files in place, taking a backup extension as its value (note: on OS X the argument must be provided, even if empty)
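
One caveat: the pipe into xargs splits on whitespace, so file names containing spaces will break these commands. A sketch using NUL separators, assuming GNU grep and xargs:

grep -rlZ '<old>' . | xargs -0 sed -i 's/<old>/<new>/g'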
Raj Shenoy
37

Those commands won't work in the default sed that comes with Mac OS X.

From man 1 sed:

-i extension
             Edit files in-place, saving backups with the specified
             extension.  If a zero-length extension is given, no backup 
             will be saved.  It is not recommended to give a zero-length
             extension when in-place editing files, as you risk corruption
             or partial content in situations where disk space is exhausted, etc.

Tried

sed -i '.bak' 's/old/new/g' logfile*

and

for i in logfile*; do sed -i '.bak' 's/old/new/g' "$i"; done

Both work fine.
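
If you don't want the .bak backup files, BSD sed also accepts a zero-length extension as a separate empty argument, though the man page quoted above warns against it:

sed -i '' 's/old/new/g' logfile*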

funroll
  • @sumek Here's an example terminal session on OS X that shows sed replacing all occurrences: [GitHub Gist](https://gist.github.com/funroll/5504098) – funroll May 02 '13 at 18:11
  • I used this to replace two different lines in all my website config files with the one-liner `sed -i.bak "s/supercache_proxy_config/proxy_includes\/supercache_config/g; s/basic_proxy_config/proxy_include\/basic_proxy_config/g" sites-available/*`. Don't forget to delete the *.bak files when you are done, for file-system hygiene's sake. – Josiah Jun 01 '15 at 19:14
33

@PaulR posted this as a comment, but people should view it as an answer (and this answer works best for my needs):

sed -i 's/abc/xyz/g' xa*

This will work for a moderate number of files, probably on the order of tens, but probably not on the order of millions.
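
The practical ceiling here is the kernel's argument-length limit, which you can check on your own system:

getconf ARG_MAX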

palswim
  • If you have forward slashes in your replacement, use a different delimiter. Another example, with file paths: `sed -i 's|auth-user-pass nordvpn.txt|auth-user-pass /etc/openvpn/nordvpn.txt|g' *.ovpn`. – Léo Léopold Hertz 준영 Mar 05 '17 at 20:59
15

Another more versatile way is to use find:

sed -i 's/asd/dsg/g' $(find . -type f -name 'xa*')
dkinzer
  • The output of that find command gets expanded, so this doesn't address the issue. Instead you should use -exec – ealfonso Jun 08 '15 at 10:28
  • @erjoalgo this works because the sed command can handle multiple input files. Expansion of the find command is exactly what is needed to make it work. – dkinzer Jun 08 '15 at 15:14
  • it works as long as the number of files doesn't push into command line limits. – ealfonso Jun 08 '15 at 17:07
  • That limit is dependent only on the memory resources available to the machine and it's exactly the same as the limit for exec. – dkinzer Jun 08 '15 at 17:49
  • That is simply not true. In your command above, the $(find . ...) gets expanded into a single command, which could be very long if there are many matching files. If it is too long (for example, on my system the limit is around 2097152 characters) you might get an "Argument list too long" error and the command will fail. Please google this error to get some background on it. – ealfonso Jun 08 '15 at 17:59
  • OK, I guess I've just never expanded a search that long before. – dkinzer Jun 08 '15 at 18:05
5

I use find for similar tasks. It is quite simple: you pass its output to sed as an argument list, like this:

sed -i 's/EXPRESSION/REPLACEMENT/g' `find -name "FILE.REGEX"`

This way you don't have to write complex loops, and it is simple to see which files you are going to change: just run the find by itself before you run sed (an example follows below).
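
For example, as a two-step workflow (EXPRESSION, REPLACEMENT, and FILE.REGEX are placeholders, as in the answer above):

find -name "FILE.REGEX"                                        # preview which files will be changed
sed -i 's/EXPRESSION/REPLACEMENT/g' `find -name "FILE.REGEX"`  # then perform the replacement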

Bluesboy
  • This is exactly the same as [@dkinzer’s answer](https://stackoverflow.com/a/27849922/2948889). – Mr. Tao Nov 12 '18 at 16:40
1

You can search for the text 'xxxx' and replace it with 'yyyy':

grep -Rn 'xxxx' /path | awk -F: '{print $1}' | xargs sed -i 's/xxxx/yyyy/'
1

There are some good answers above. I thought I'd throw in one more that is succinct and parallelizable, using GNU parallel, which I often prefer to xargs:

parallel sed -i 's/abc/xyz/g' {} ::: xa*

Combine this with the -j N option to run N jobs in parallel.
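
For example, to run up to four jobs at once:

parallel -j 4 sed -i 's/abc/xyz/g' {} ::: xa*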

Paul M.
0

If you are able to run a script, here is what I did for a similar situation:

Using a dictionary/hashMap (associative array) and variables with the sed command, we can loop through the array and replace several strings. Including a wildcard in name_pattern allows replacing in place in files matching a pattern (something like name_pattern='File*.txt') in a specific directory (source_dir). All the changes are written to the logfile in destin_dir.

#!/bin/bash
source_dir=source_path
destin_dir=destin_path
logfile='sedOutput.txt'
name_pattern='File.txt'    # may contain a wildcard, e.g. 'File*.txt'

echo "--Begin $(date)--" | tee -a "$destin_dir/$logfile"
echo "Source_DIR=$source_dir destin_DIR=$destin_dir"

# Each key is the string to replace; each value is its replacement.
declare -A pairs=(
    ['WHAT1']='FOR1'
    ['OTHER_string_to replace']='string replaced'
)

for i in "${!pairs[@]}"; do
    j=${pairs[$i]}
    echo "[$i]=$j"
    replace_what=$i
    replace_for=$j
    echo " "
    echo "Replace: $replace_what for: $replace_for"
    # Quote "$name_pattern" so the shell doesn't expand the wildcard; find does.
    find "$source_dir" -name "$name_pattern" | xargs sed -i "s/$replace_what/$replace_for/g"
    # Log every line that now contains the replacement.
    find "$source_dir" -name "$name_pattern" | xargs -I{} grep -n "$replace_for" {} /dev/null | tee -a "$destin_dir/$logfile"
done

echo " "
echo "----End $(date)---" | tee -a "$destin_dir/$logfile"

First, the pairs array is declared; each pair defines one replacement, so WHAT1 will be replaced with FOR1, and OTHER_string_to replace will be replaced with string replaced in the file File.txt. In the loop the array is read: the first member of each pair is retrieved as replace_what=$i and the second as replace_for=$j. The find command searches the directory for the file name (which may contain a wildcard), and the sed -i command performs the replacement in place in the matching file(s). Finally, a grep redirected to the logfile records the changes made in the file(s).

This worked for me with GNU Bash 4.3 and sed 4.2.2, and is based upon VasyaNovikov's answer for Loop over tuples in bash.

Lejuanjowski
0

The Silver Searcher Solution

I'm adding another option for those people who don't know about the amazing tool called The Silver Searcher (command line tool is ag).

Note: You can use grep and other tools to do the same thing here, but The Silver Searcher is fantastic :)

TLDR

ag -l 'abc' | xargs sed -i 's/abc/xyz/g'

Install The Silver Searcher

sudo apt install silversearcher-ag                # Debian / Ubuntu
sudo pacman -S the_silver_searcher                # Arch / EndeavourOS
sudo yum install epel-release the_silver_searcher # RHEL / CentOS

Demo Files

Paste the following into your terminal to create some demonstration files:

mkdir /tmp/food
cd /tmp/food
content="Everybody loves to abc this food!"
echo "$content" > ./milk
echo "$content" > ./bread
mkdir ./fastfood
echo "$content" > ./fastfood/pizza
echo "$content" > ./fastfood/burger
mkdir ./fruit
echo "$content" > ./fruit/apple
echo "$content" > ./fruit/apricot

Using 'ag'

The following ag command will recursively find all the files that contain the string 'abc'. It skips the .git directory and honors .gitignore and other ignore files:

$ ag 'abc'
milk
1:Everybody loves to abc this food!

bread
1:Everybody loves to abc this food!

fastfood/burger
1:Everybody loves to abc this food!

fastfood/pizza
1:Everybody loves to abc this food!

fruit/apple
1:Everybody loves to abc this food!

fruit/apricot
1:Everybody loves to abc this food!

To just list the files that contain the string 'abc', use the -l switch:

$ ag -l 'abc'
bread
fastfood/burger
fastfood/pizza
fruit/apricot
milk
fruit/apple

Changing Multiple Files

Finally, using xargs and sed, we can replace the 'abc' string with another string:

ag -l 'abc' | xargs sed -i 's/abc/eat/g'

In the above command, ag lists all the files that contain the string 'abc'. The xargs command takes those file names and passes them as arguments to the sed command.
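
As with the other xargs pipelines on this page, file names containing spaces can break the split. If your ag build supports NUL-separated output (the -0/--null flag; verify against your version), a safer sketch is:

ag -0 -l 'abc' | xargs -0 sed -i 's/abc/eat/g'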

Grant Carthew