
I am trying to write a script that finds files older than 10 hours in the sub-directories listed in "HS_client_list", and sends the output to the file "find.log".

```sh
#!/bin/bash

while IFS= read -r line; do

  echo Executing cd /moveit/$line
  cd /moveit/$line

  # Find files more than 600 minutes (10 hours) old.
  find $PWD -type f -iname "*.enc" -mmin +600 -execdir basename '{}' ';' | xargs ls > /home/infa91punv/find.log

done < HS_client_list
```

The script is able to `cd` into the folders from HS_client_list (this file contains the names of the sub-directories), but the `find` command (`find $PWD -type f -iname "*.enc" -mmin +600 -execdir basename '{}' ';' | xargs ls > /home/infa91punv/find.log`) is not working: the output file is empty. Yet when I run the same `find` pipeline directly on the command line it works; from the script it doesn't.

Mathieu
Prabhu U
  • You know that you _overwrite_ the file each loop? `as a command it works`: does it find any files when using _the last_ entry from HS_client_list? And check your script with shellcheck. – KamilCuk Mar 18 '22 at 08:36

1 Answer


You are overwriting the file in each iteration.

You can use `xargs` to run `find` on multiple directories, but you have to use an alternate replacement string to avoid having `xargs` populate the `{}` in the `-execdir` command.

```sh
sed 's%^%/moveit/%' HS_client_list |
xargs -I '<>' find '<>' -type f -iname "*.enc" -mmin +600 -execdir basename {} \; > /home/infa91punv/find.log
```

The `xargs ls` did not seem to perform any useful functionality, so I took it out. Generally, don't use `ls` in scripts.

With GNU `find`, you could avoid the call to an external utility and use the `-printf` action to print just the part of the path name that you care about.
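For example, a minimal sketch (assuming GNU `find` and GNU `touch`; the throwaway demo directory stands in for the question's /moveit tree):

```sh
# %f prints just the basename, so no -execdir basename call is needed.
demo=$(mktemp -d)
touch -d '11 hours ago' "$demo/report.enc"   # old enough to match -mmin +600
touch "$demo/fresh.enc"                      # too new; excluded
find "$demo" -type f -iname '*.enc' -mmin +600 -printf '%f\n'
# prints: report.enc
rm -r "$demo"
```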

For added efficiency, you could invoke a shell to collect the arguments:

```sh
sed 's%^%/moveit/%' HS_client_list |
xargs sh -c 'find "$@" -type f -iname "*.enc" -mmin +600 -execdir basename {} \;' _ >/home/infa91punv/find.log
```

This will run as many directories as possible in a single find invocation.
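A minimal sketch of how the `sh -c '...' _` idiom passes arguments: the literal `_` becomes `$0` inside the inner shell, and the words supplied by `xargs` land in `"$@"`:

```sh
# xargs collects the three words and appends them after the _ placeholder,
# so the inner shell sees them as its positional parameters.
printf '%s\n' one two three |
xargs sh -c 'printf "got: %s\n" "$@"' _
# prints:
# got: one
# got: two
# got: three
```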

If you want to keep your loop, the solution is to put the redirection after `done`. I would still factor out the `cd`, and take care to quote the variable interpolation.

```sh
while IFS= read -r line; do
  find /moveit/"$line" -type f -iname "*.enc" -mmin +600 -execdir basename '{}' ';'
done < HS_client_list >/home/infa91punv/find.log
```
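To see why the placement of the redirection matters, a small self-contained sketch (throwaway names, unrelated to the question's files):

```sh
# Redirecting inside the loop truncates the file on every iteration,
# so only the last write survives.
tmp=$(mktemp)
for word in alpha beta gamma; do
  echo "$word" > "$tmp"        # truncates each time
done
cat "$tmp"                     # prints: gamma

# Redirecting after done opens the file once, before the loop starts,
# so every iteration's output accumulates.
for word in alpha beta gamma; do
  echo "$word"
done > "$tmp"
cat "$tmp"                     # prints alpha, beta, gamma on separate lines
rm "$tmp"
```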
tripleee
  • I'm not entirely sure why you run `-execdir basename {} \;` -- in my brief experiments, you could replace that with simply `-execdir echo {} \;` or perhaps more efficiently and elegantly `-execdir printf '%s\n' {} +` – tripleee Mar 18 '22 at 08:47
  • It looks to me like the `xargs ls` part is not merely unnecessary, but actively problematic, since the `ls` will not be running in the right directory. `-execdir` will run `basename` (or `echo` or whatever) in the file's directory, but that doesn't apply to `xargs` or `ls`. – Gordon Davisson Mar 18 '22 at 10:03
  • Also `find` can take more than one directory at a time to search, so in theory you could do this (untested) `find $(sed 's%^%/moveit/%' HS_Client_List) -type f ... `. – Anthony C Howe Mar 18 '22 at 13:54
  • @AnthonyCHowe The second `xargs` example already does exactly that, but avoids the problems with using a command substitution, which won't work if the paths contain unquoted whitespace or shell metacharacters. – tripleee Mar 18 '22 at 13:56