5

I have a script like this:

#!/bin/sh

echo "hello"
echo "goodbye"
exit 1

When I run it on its own, I get the failed exit code as I expect.

$ ./fail.sh
hello
goodbye
$ echo $?
1

However, when I run it through grep -v, the exit status changes to success:

$ ./fail.sh | grep -v hello
goodbye
$ echo $?
0

Is there a way to pipe a command's output into grep -v and still have the status code be properly propagated? Of course in the real world the point of this would be to filter the output of a noisy command, while still detecting if the command failed.

limp_chimp
  • Don't know if you can do that directly, but you can always redirect output to a tmp file, which allows you to grab the return of the initial grep and then apply your second grep to the output file contents. – TheGreatContini May 09 '16 at 23:28
  • limp_chimp have you had a chance to test the suggestions that @hek2mgl or I posted for you? If it works for you be sure to post feedback and consider accepting the answer. – John Mark Mitchell May 10 '16 at 15:55
  • Not sure why this was missed, but `grep -v` is a *really bad* command to use for this example. That is because although `grep` follows convention with regards to exit status, `grep -v` does not necessarily. In other words, using `-v` inverts the match but *not* the exit code. My experience is that `grep -v` returns success when it returns non-matches. If there are no non-matches to return it returns fail. If nothing matches it returns all the lines that *do not match* and success. – ingyhere Apr 10 '17 at 21:39
  • ^ Using `grep (GNU grep) 2.24` (`Copyright (C) 2016 Free Software Foundation, Inc.`) and here is a [citation](https://lists.gnu.org/archive/html/bug-gnu-utils/2004-07/msg00124.html) – ingyhere Apr 10 '17 at 21:44
  • Use `-L` to get an exit code for a matching line that should not be found in a file: https://unix.stackexchange.com/a/534581/43233 . For example: `grep -L 'match' file | grep .` – Noam Manos Jun 15 '23 at 23:00
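
To see the behaviour ingyhere describes with `grep -v`, you can try it against a couple of one-off inputs (a quick sketch; details may vary slightly between grep versions):

# every line matches, so grep -v has nothing to print and exits 1
printf 'hello\n' | grep -v hello; echo $?

# "goodbye" survives the filter, so grep -v prints it and exits 0
printf 'hello\ngoodbye\n' | grep -v hello; echo $?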

2 Answers

8

Leveraging set -o pipefail, the following should work:

( set -o pipefail; ./fail.sh | grep -v hello )

You can then test the value in $?:

( set -o pipefail; ./fail.sh | grep -v hello ); if [[ "$?" -eq "1" ]]; then echo success; else echo bummer; fi 

It should output:

goodbye
success

What is happening and why does this work?

As noted in the OP, a pipeline normally returns a failure (non-zero return code) only if its last command fails. With set -o pipefail, the pipeline instead returns a failure if any command in it fails, and the return code it passes along is that of the last (rightmost) command that failed.

You can test this by updating your script to:

#!/bin/sh

echo "hello"
echo "goodbye"
exit 5

then run the following:

( set -o pipefail; ./fail.sh | grep -v hello ); echo $?

It should output:

goodbye
5

The above illustrates that set -o pipefail does not just turn the pipeline's status into a generic failure; it relays the exact return code of the failed command (5 here) unchanged.
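
If you want this behaviour for every pipeline in a script, rather than in a one-off subshell, you can also set the option once near the top of the script (a minimal sketch, assuming the script is run with bash rather than a plain POSIX sh):

#!/bin/bash
set -o pipefail

./fail.sh | grep -v hello    # the pipeline now exits with fail.sh's status
echo "pipeline returned $?"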

John Mark Mitchell
  • Sadly `pipefail` is only available on Bash, so your code fails where `sh` is Dash (e.g. Ubuntu) or where you are explicitly running your code under some particular non-Bash shell. Perhaps there's a more POSIX-compliant solution? – Lightness Races in Orbit Nov 21 '17 at 18:41
  • When answering the question, I had taken into account that the question had the BASH tag, and as such, I expected that the OP was using Bash. I am open to other suggestions that are more POSIX happy. – John Mark Mitchell Apr 24 '20 at 20:06
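
As a narrow workaround for the Dash-as-`sh` case raised in the comments, one option (my own suggestion, not part of the answer above) is to run just the pipeline under an explicit bash, since pipefail only needs to be in effect for that single command:

# run the pipeline in a bash child process so pipefail is available,
# then read its exit status back in the calling POSIX shell
bash -c 'set -o pipefail; ./fail.sh | grep -v hello'
echo $?
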
4

Bash normally gives you the exit code of the last process in the pipe, but you can use the Bash-specific PIPESTATUS array to inspect the earlier commands:

./fail.sh | grep -v hello
if [ ! "${PIPESTATUS[0]}" ] ; then
    echo "command has failed"
fi
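
For example, you can capture both exit codes from the same pipeline, because every expansion in a single command still sees the PIPESTATUS values left behind by that pipeline (a quick sketch using the fail.sh from the question; expect "script: 1, grep: 0"):

./fail.sh | grep -v hello
echo "script: ${PIPESTATUS[0]}, grep: ${PIPESTATUS[1]}"
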
hek2mgl