I'm late to the party, but there is one more solution that wasn't covered here: user-defined functions. Putting multiple instructions on one line is unwieldy and can be hard to read/maintain. The for loop above avoids that, but it raises the possibility of exceeding the command-line length.
Here's another way (untested).
function processFiles {
    ls -aldF "$@"    # log what is about to be removed
    rm -rdf "$@"     # then remove it
}
export -f processFiles
find . -name "filename including space" -print0 \
| xargs -0 bash -c 'processFiles "$@"' dummyArg > log.txt
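To sanity-check things before letting rm loose, you can first run a harmless variant of the same function (a throwaway sketch; processFilesDryRun is just a name I made up):
function processFilesDryRun {
    ls -aldF "$@"        # show the matches
    echo rm -rdf "$@"    # print the rm command instead of running it
}
export -f processFilesDryRun
find . -name "filename including space" -print0 \
| xargs -0 bash -c 'processFilesDryRun "$@"' dummyArg > log.txt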
This is pretty straightforward except for the "dummyArg", which gave me plenty of grief. When running bash this way, the arguments that xargs appends are read into
"$0" "$1" "$2" ....
instead of the expected
"$1" "$2" "$3" ....
Since the "$@" inside the -c string only expands "$1" onward, we have to insert a dummy value to occupy "$0".
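A quick way to see that numbering in action (a throwaway one-liner, not part of the solution):
bash -c 'echo "0=$0 1=$1 2=$2"' dummyArg fileA fileB
which prints 0=dummyArg 1=fileA 2=fileB, showing that the first extra argument always lands in "$0".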
Footnotes:
- I am using some elements of bash syntax (e.g. "export -f"), but I believe this will adapt to other shells.
- The first time I tried this, I didn't add a dummy argument. Instead I added "$0" to the argument lines inside my function (e.g. ls -aldF "$0" "$@"). Bad idea.
Aside from stylistic issues, it breaks when the "find" command returns nothing: xargs may still invoke bash once with no extra arguments, in which case $0 defaults to "bash" and the function operates on that bogus value. Using the dummy argument instead avoids all of this.
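One more hedge: if you are on GNU xargs, the -r (--no-run-if-empty) flag skips the command entirely when find produces no output, which sidesteps the empty-input case altogether (a sketch; -r is not available in every xargs implementation):
find . -name "filename including space" -print0 \
| xargs -0 -r bash -c 'processFiles "$@"' dummyArg > log.txt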