
I'm not that good with bash, but I'm trying to create a script to kill some java processes:

/usr/ucb/ps -auxww    \
  | grep 'XUnit'      \
  | grep -v 'grep'    \
  | cut -c -2000      \
  | awk '{print $2;}' \
  | xargs kill

cut is used here because awk can fail on excessively long lines (see the references to the LINE_MAX limit in the POSIX specification for awk).

The problem occurs when there are no such processes: xargs then runs kill with no arguments, resulting in a usage error.

My xargs does not accept -r or --no-run-if-empty args, as suggested in answers to a related question that doesn't specify POSIX compliance.
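To make the failure concrete (echo is used here as a harmless stand-in for kill; the behavior shown is GNU xargs, and note that BSD xargs skips empty input by default):

```shell
# Without -r, xargs runs its command once even when stdin is empty,
# so kill would be invoked with no PIDs and print a usage error.
printf '' | xargs echo kill
# → kill   (GNU xargs; BSD xargs would print nothing)
```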

asked by awfun; edited by Charles Duffy
  • It is already answered here: http://stackoverflow.com/questions/8296710/ignore-empty-result-for-xargs – scope Dec 23 '15 at 17:00
  • BTW -- if you have `pkill`, you'd be better off using that to do this in a single command instead of trying to filter `ps` output through a huge pipeline. Also, `awk` can do itself the work of `grep` and `cut`, so even if you were going to stick with a pipeline, there's no reason for it to be so complex. – Charles Duffy Dec 23 '15 at 17:02
  • ie: `ps auxww | awk '(/XUnit/ && ! /awk/) { print $2 }'`. But, as above, best practice is not to use `ps` at all. – Charles Duffy Dec 23 '15 at 17:06
  • Hmm. Reading the POSIX spec for `awk`, it *is* allowed to be to subject to `LINE_MAX`, typically 2k. Just learned something new there -- though I'm curious which implementation you're using that actually enforces that limit. – Charles Duffy Dec 23 '15 at 17:07
  • @CharlesDuffy , how can I use pkill which greps by process arguments? I don't like this long command either, but as I mentioned above I rarely use bash - this is Frankenstein from commands I succeed to find in Google – awfun Dec 23 '15 at 17:12
  • `pkill -f` looks at argument lists. – Charles Duffy Dec 23 '15 at 17:16
  • xargs I have does not accept -r or --no-run-if-empty args – awfun Dec 23 '15 at 17:16
  • Yah, if you want a POSIX solution, that makes it more interesting (and distinguishes your question from the one of which it's otherwise duplicative), but you'd need to edit your question to clarify that (ie. point to the other version, and specify that you need a version that works with baseline POSIX xargs instead). – Charles Duffy Dec 23 '15 at 17:18
  • pkill -f 'XUnit' does not kill the processes showed by /usr/ucb/ps -auxww | grep 'XUnit' | grep -v 'grep' – awfun Dec 23 '15 at 17:21
  • BTW, when you said "not that good with bash" -- is your shell *actually* bash, or is it only guaranteed to be POSIX sh compliant (or, worse, Bourne)? – Charles Duffy Dec 23 '15 at 17:30
  • As an aside: a neat way to avoid the `grep -v 'grep'` is `grep 'XUni[t]'`. – Benjamin W. Dec 23 '15 at 17:37
  • @BenjaminW., yes! I think the POSIXly parsimonious approach is `ps -e -o pid= -o args= | awk '/[X]Unit/{ print $1 }'`. Maybe throw a `cut -c -$(getconf LINE_MAX)` before the _awk_ if you are concerned about that limit. – pilcrow Dec 23 '15 at 17:48
  • @pilcrow, heh -- I'd been under the assumption that `getconf` was a Linuxism, but you're right, it's POSIX-specified. – Charles Duffy Dec 23 '15 at 17:56
  • @CharlesDuffy, I just checked the same thing to confirm before commenting, and to confirm the variable name of LINE_MAX. :) – pilcrow Dec 23 '15 at 17:57
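Pulling the comment thread together: a hypothetical end-to-end sketch that drops grep and cut in favor of a single awk, plus a guard so kill never runs with zero arguments (the XUnit pattern is from the question; the PIDs below are made up):

```shell
# The live command would look like this (careful: it kills processes):
#   ps -e -o pid= -o args= \
#     | awk '/[X]Unit/ { print $1 }' \
#     | xargs sh -c '[ $# -gt 0 ] && exec "$0" "$@"' kill
#
# Demonstrating the awk stage on a canned ps snapshot: the bracketed
# pattern [X]Unit never matches its own literal text, so the filtering
# process cannot select itself (no 'grep -v grep' needed).
printf '%s\n' \
  '123 java -cp tests XUnitRunner' \
  '456 sh -c ps -e' \
  '789 sleep 60' \
  | awk '/[X]Unit/ { print $1 }'
# → 123
```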

2 Answers


Addressing the specific question asked, setting aside whether this pipeline is an appropriate way to kill a process:

xargs sh -c '[ $# -gt 0 ] && exec "$0" "$@"' kill

This approach has xargs launch a shell that checks the length of its argument list. Arguments following -c 'script' are assigned starting at $0, which is not included in the $# count, so $# is 0 when only kill is passed; the shell execs the given command only if at least one argument follows it.
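A quick demonstration of the guard, with echo standing in for kill:

```shell
# The guard script from the answer:
guard='[ $# -gt 0 ] && exec "$0" "$@"'

# Non-empty input: the inner shell sees two arguments and execs echo.
printf '101 102\n' | xargs sh -c "$guard" echo
# → 101 102

# Empty input: a default xargs still starts the shell once, but $# is 0,
# so nothing is exec'd and nothing is printed. (The guard shell exits 1,
# which xargs reports as status 123; hence the || true here.)
printf '' | xargs sh -c "$guard" echo || true
```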

answered by Charles Duffy
  • Why not just put a `handle_empty` function in the pipeline? If there is an empty argument, exit with 0 and echo 'there was a blank line'... see: https://gist.github.com/ORESoftware/bb8f97354ff38ee4a0a1dd1589af571a – Alexander Mills May 28 '19 at 19:44
  • or use the -r option - https://unix.stackexchange.com/questions/521595/if-there-is-empty-stdin-or-whatever-tell-xargs-not-to-care/521599#521599 – Alexander Mills May 28 '19 at 19:55
  • `-r` isn't portable -- as the answer you link indicates. Exported functions aren't portable to non-bash shells, either. The OP is very clear that the question is asking specifically for a solution that will work on POSIX-baseline platforms (and that we already have plenty of near-duplicates that lack that requirement). – Charles Duffy May 28 '19 at 20:45
  • FWIW, I find it absolutely hilarious – even dangerous – that POSIX xargs behaves this way. – Torsten Bronger Jan 04 '21 at 08:07
  • @TorstenBronger, ...honestly, the lack of some `-r` equivalent is probably the _least_ dangerous of POSIX `xargs`' design misfeatures; it's a lot less unsafe than the shell-like (but not quite shell-compatible) input parsing, or permitting implementations to perform substring expansions (thus encouraging `xargs -I{} sh -c '...{}...'` use modes with the attendant shell injection risk). – Charles Duffy Jan 04 '21 at 14:35
  • At least, in those cases you know that you enter the danger zone. However, I had an almost trivial xargs call that went amok because the behaviour is totally counter-intuitive. – Torsten Bronger Jan 06 '21 at 22:57
  • Note that if you use `set -e` and the `[ $# -gt 0 ]` returns non-zero, your script will stop running. To work around this, you can use `if ... fi`, i.e., `xargs sh -c 'if [ "$#" -gt 0 ]; then exec "$0" "$@"; fi' kill`. – Shane Bishop Sep 02 '22 at 14:49
  • @ShaneBishop, `[ $# -gt 0 ]` is considered "checked" when there's a `&&` after it, so it doesn't cause the script to abort. (`set -e` has a ton of corner cases and rules like this, and they change between shells and shell versions, which is part of why I [strongly advise against its use](https://mywiki.wooledge.org/BashFAQ/105#Exercises); but that particular behavior is effectively universal). – Charles Duffy Sep 02 '22 at 14:57
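That checked-command behavior is easy to verify in a scratch shell:

```shell
# Under set -e, a failing test on the left of && is "checked": the shell
# does not abort there, it just skips the right-hand side and carries on.
sh -c 'set -e; [ "$#" -gt 0 ] && echo "got args"; echo "still running"'
# → still running
```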

Normally there's the -r option to xargs; see:

https://unix.stackexchange.com/questions/521595/if-there-is-empty-stdin-or-whatever-tell-xargs-not-to-care/521599#521599

Otherwise, you can create a bash function and put it in the pipeline before xargs:

handle_empty(){
  # Pass lines through until an empty one appears; read -r avoids
  # mangling backslashes, and >&2 is more portable than /dev/stderr.
  while read -r line; do
    if test -z "$line"; then
        echo 'There was an empty line, exiting.' >&2
        exit 0
    fi
    echo "$line"
  done
}

export -f handle_empty

and use it like so:

docker volume ls -qf dangling=true | handle_empty | xargs docker volume rm 

for more info see: https://gist.github.com/ORESoftware/bb8f97354ff38ee4a0a1dd1589af571a

answered by Alexander Mills; edited by Jens
  • From the very last line of the question: *My xargs does not accept `-r` or `--no-run-if-empty` args, as suggested in answers to a related question that doesn't specify POSIX compliance.* Similarly, as they're specifying strict POSIX compliance, exported functions are out too. – Charles Duffy May 28 '19 at 20:47
  • I'm not sure that `handle_empty` actually does what you're looking for here. All parts of a pipeline run at the same time -- the `handle_empty` function exiting doesn't mean that `xargs` doesn't run (or prevent `xargs` from being invoked with no input; for that matter, if there's no input at all, the `while read` loop will never be entered in the first place), it just ensures that xargs can't see any content _past_ the first blank line in input. – Charles Duffy Feb 21 '21 at 18:12
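The no-input-at-all caveat above is easy to reproduce: with empty stdin the `while read` loop body never executes, so a default xargs (GNU shown, without -r) still invokes its command once:

```shell
# The function from the answer above (with read -r for safety):
handle_empty(){
  while read -r line; do
    if test -z "$line"; then
      echo 'There was an empty line, exiting.' >&2
      exit 0
    fi
    echo "$line"
  done
}

# Empty stdin: handle_empty emits nothing, and GNU xargs without -r
# still runs its command once anyway.
printf '' | handle_empty | xargs echo would-have-run
# → would-have-run
```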