
I want to replace a string that occurs in several files. For a single file I can do it with this Unix command:

sed 's/error("/printf( "ERROR : /g' all_reset_test.c > new_reset/all_reset_test.c 

which replaces every occurrence of 'error("' with 'printf( "ERROR : ' in that file.

But I have over 100 files that need the same change. How can I run this command for all of the files at once, in either a Perl or a Python script?
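Since the question asks for a Python approach, here is a minimal sketch of the batch replacement. The function name `replace_in_files` and the `new_reset/` output directory are illustrative, mirroring the original sed command; adjust the glob pattern to match your files.

```python
import glob
import os

def replace_in_files(pattern, old, new, out_dir):
    """Copy each file matching `pattern` into `out_dir`,
    replacing every occurrence of `old` with `new`."""
    os.makedirs(out_dir, exist_ok=True)
    for path in glob.glob(pattern):
        with open(path) as src:
            text = src.read()
        out_path = os.path.join(out_dir, os.path.basename(path))
        with open(out_path, "w") as dst:
            dst.write(text.replace(old, new))

# Equivalent of the sed command, applied to every .c file in the current directory:
replace_in_files("*.c", 'error("', 'printf( "ERROR : ', "new_reset")
```

Note that `str.replace` does plain substring replacement, so the `(` and `"` characters need no escaping, unlike in a sed or Perl regex.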

Dave.Gugg
Pushpe

5 Answers


For example, if your files have the extension .txt, you can use this:

%> perl -pi -e 's/error\("/printf( "ERROR : /g' *.txt
Miguel Prz

You can use sed's -i option.

Quoting from sed's manpage:

-i[SUFFIX], --in-place[=SUFFIX]

      edit files in place (makes backup if extension supplied)

If you omit the SUFFIX, sed will not create a backup before modifying the file.

In your case, this:

sed -i 's/error("/printf( "ERROR : /g' *.c

should do the job (without Python, Perl, or a bash loop).

branch14

You can use a simple bash for loop to apply this logic to all of your files:

for f in $(ls *.c); do 
   sed 's/error("/printf( "ERROR : /g' ${f} > new_reset/${f}
done

The $(ls *.c) portion should be replaced by whatever ls command will select the files that you want to apply the sed command to.

Hunter McMillen
    This may have produced the expected output for some specific set of input files but it's wrong in general in a couple of ways that will cause surprising failures given different file names and directory contents. Never use `for` on the output of `ls` and always quote your variables unless you fully understand the consequences and have a specific goal in mind. The correct syntax for this approach would be `for f in *.c; do sed '...' "$f" > "new_reset/$f"; done` but see @branch14's answer for the better approach. – Ed Morton Aug 20 '14 at 21:00
  • -1 for adding a superfluous and buggy command substitution instead of sticking to plain globbing. – user2719058 Aug 21 '14 at 20:58

Just wrap your sed command in a for loop:

for file in $(cat file_list)
do
    sed 's/error("/printf( "ERROR : /g' "$file" > "new_reset/$file"
done

Of course, the list of files to edit can be obtained in multiple ways:

for file in *.c  # if the files are in the same folder
do
    sed 's/error("/printf( "ERROR : /g' "$file" > "new_reset/$file"
done

Or

for file in $(find . -type f -name '*.c')
do
    sed 's/error("/printf( "ERROR : /g' "$file" > "new_reset/$file"
done
Roberto Reale

You can iterate over files in the shell; e.g., this finds all the .txt files in and below the current directory. Each file is available in $f:

for f in $(find . -name \*.txt); do
   # run my sed script for $f
done

There are numerous options for iterating over a set of files using bash. See here for some options. If your filenames have spaces, you will have to be careful, and this SO question/answer details some options.
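If you would rather sidestep the shell quoting pitfalls entirely, the same traversal can be sketched in Python (which the question asked for), where filenames containing spaces need no special handling. This assumes the script is run from the directory you want to search:

```python
from pathlib import Path

# Recursively find every .txt file below the current directory;
# each `path` is a Path object, analogous to $f in the shell loop above.
for path in Path(".").rglob("*.txt"):
    print(path)
```

From there you can open each `path` and apply the replacement directly, with no risk of word splitting.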

Brian Agnew