
Let's say I have a bunch of *.tar.gz files located in a hierarchy of folders. What would be a good way to find those files and then execute multiple commands on each of them?

I know if I just need to execute one command on the target file, I can use something like this:

$ find . -name "*.tar.gz" -exec tar xvzf {} \; 

But what if I need to execute multiple commands on the target file? Must I write a bash script here, or is there any simpler way?

Sample commands that need to be executed on an A.tar.gz file:

$ tar xvzf A.tar.gz   # assume it untars to folder logs
$ mv logs logs_A
$ rm A.tar.gz
– artm
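
For reference, find itself can run several commands per file without a separate script by handing each match to an inline shell via -exec sh -c. A minimal sketch (it assumes, as in the example above, that each archive untars to a folder named logs, and the logs_* renaming mirrors the mv step):

    find . -name "*.tar.gz" -exec sh -c '
        tar xvzf "$1"                              # untar (assumed to create ./logs)
        mv logs "logs_$(basename "$1" .tar.gz)"    # e.g. logs -> logs_A
        rm "$1"                                    # remove the archive afterwards
    ' sh {} \;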

2 Answers


Here's what works for me (thanks to Etan Reisner's suggestions):

    #!/bin/bash
    # the target folder (to search for tar.gz files) is parsed from the command line
    find "$1" -name "*.tar.gz" -print0 | while IFS= read -r -d '' file; do
        # the null-delimited read does the magic of assigning each tar.gz file to the shell variable `file`
        echo "$file"                      # then we can do everything with the `file` variable
        tar xvzf "$file"
        # mv untar_folder "$file".suffix  # untar_folder is the name of the folder after untarring
        rm "$file"
    done
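
Assuming the script above is saved as untar_all.sh (a name chosen here for illustration) and made executable, it takes the target folder as its single argument:

    $ chmod +x untar_all.sh
    $ ./untar_all.sh /path/to/folder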

As suggested in the comments, the array approach is unsafe if a file name contains spaces, and it also doesn't seem to work properly in this case.

– artm

Writing a shell script is probably easiest. Take a look at shell for loops. You could store the output of a find command in an array, and then loop over that array to perform a set of commands on each element.

For example,

arr=( $(find . -name "*.tar.gz" -print0) )
for i in "${arr[@]}"; do
    # $i now holds each of the filenames output by find
    tar xvzf $i
    mv $i $i.suffix
    rm $i
    # etc., etc.
done
– Langston
  • +1 for the links and a sample. But it's not quite working for me yet. Here's the simplest case, where I just need to print out the names of all files using the arr syntax you recommended. It just prints out the first file only (i.e. it doesn't really go through all the items in the array). Did I miss something? Have you tried it on your PC? `#!/bin/bash` `arr=( $(find . -type f) )` `for i in $arr ; do` `echo $i` `done` – artm Jul 15 '15 at 05:17
  • 1
    Collecting the results of `find` into an array like this is **not** safe for file names that can contain whitespace or other shell glob/metacharacters. – Etan Reisner Jul 15 '15 at 05:24
  • 1
    `$arr` does not expand to the array contents it expands to the first value in the array (it is the same as `${arr[0]}`). You need `"${arr[@]}"` to get all the array values (but with this code the quotes there don't matter as you will have already blown up any files with spaces/etc. as per my first comment). See [Bash FAQ 001](http://mywiki.wooledge.org/BashFAQ/001) for ways to safely read lines of data in the shell, specifically the example that uses `find`'s `-print0` argument. – Etan Reisner Jul 15 '15 at 05:26
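
Following up on Etan Reisner's comments, a safe array-based variant would read find's null-delimited output with mapfile rather than a plain command substitution. A minimal sketch (not from the original answer; `mapfile -d ''` requires bash 4.4 or newer):

    #!/bin/bash
    # read NUL-delimited paths into an array; -t strips the trailing delimiter
    mapfile -d '' -t arr < <(find . -name "*.tar.gz" -print0)
    for i in "${arr[@]}"; do
        echo "$i"    # each element is one complete path, spaces and all
    done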