
Sorry if this is a duplicate; I had no idea what to search for...

My use case is more complex, but it can be narrowed down to the following problem:

I want to run a bash script which invokes all sorts of binaries, for example grep. I want to assert that the binaries were invoked with the correct arguments. These assertions should be part of automated testing; I don't want to manually check things. This should go into the CI cycle.

Is there some standard way of doing this?

If not, I thought of moving aside each binary I wish to assert on, replacing it with a spy which first logs the arguments and then invokes the original binary, and finally removing itself and restoring the original binary.

Is this feasible? Is there a better approach to the problem?
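For reference, the spy idea described above might be sketched roughly like this (a minimal sketch; the function names and the `.calls`/`.real` suffixes are made up for illustration, not an existing tool):

```shell
#!/bin/bash
# spy_wrap: replace a binary with a wrapper that appends each argument
# list to "<binary>.calls" and then runs the original binary.
spy_wrap() {
  local orig="$1"
  mv "$orig" "$orig.real"          # keep the real binary aside
  cat > "$orig" <<'EOF'
#!/bin/bash
printf '%s\n' "$*" >> "$0.calls"   # log the arguments
exec "$0.real" "$@"                # then invoke the original binary
EOF
  chmod +x "$orig"
}

# unspy: restore the original binary and remove the log.
unspy() {
  mv "$1.real" "$1"
  rm -f "$1.calls"
}
```

The test would call `spy_wrap` before running the script under test, assert on the `.calls` file afterwards, and finish with `unspy`.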

nathan g
  • Wouldn't `set -x` be enough? – Biffen Jun 16 '15 at 10:08
  • If you don't want to change the scripts, run them like this `bash -x ./scriptname` – Mark Setchell Jun 16 '15 at 10:09
  • I can't change the way the scripts are executed, and running them with -x will show me that the binaries should have been called (but maybe they weren't on the path, or something?), and I would have to parse it from the console. It's not that this is impossible, but it's less convenient, IMO. I want to start the automated test, have the system boot, run whatever it runs, then have the test assert that the things I expected to happen, happened. – nathan g Jun 16 '15 at 10:23
  • I would call that a `wrapper`. It is quite a hack, but you can do it like this... http://stackoverflow.com/a/24202568/2836621 – Mark Setchell Jun 16 '15 at 10:30
  • @MarkSetchell That's exactly what I'm looking for; I even wrote that as a possible approach to the problem. I just wondered if there was something ready-made as an alternative. It's great that you referenced a feasible implementation, thank you. If you want, you can write that as an answer and I'll accept it. I don't think this is a duplicate; I think this is another use case with the same solution. – nathan g Jun 16 '15 at 10:46

1 Answer


Just an idea; I didn't see this anywhere, but:

Unless you're using full paths to invoke those binaries, you could create mocks of them, e.g. in your project's bin/ directory, and make that directory the first entry in your $PATH:

export PATH="$PWD/bin:$PATH"

To mock grep, for example, you could do:

A helper executable to increment counts:

#!/bin/bash
# Usage: increment FILE_WITH_A_NUMBER
touch "$1"  # create the file if it doesn't exist
NTIMES=$(( $(cat "$1") + 1 ))
echo "$NTIMES" > "$1"
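As a quick sanity check, the helper behaves like a counter; an empty file counts as zero thanks to bash's arithmetic expansion. Reproduced here as a function so the snippet is self-contained:

```shell
#!/bin/bash
# Same logic as the increment script above, wrapped in a function.
increment() {
  touch "$1"                       # create the counter file if missing
  local n=$(( $(cat "$1") + 1 ))   # an empty file evaluates as 0
  echo "$n" > "$1"
}

counter=$(mktemp)                  # stand-in for bin/grep.log
increment "$counter"
increment "$counter"
cat "$counter"                     # prints 2
```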

A mock template (in bin/grep):

#!/bin/bash

increment "${BASH_SOURCE[0]}.log"    # => bin/grep.log

# Possibly do some other stuff, such as logging the parameters

# Return a mocked result:
echo "fake match"

# Or delegate to the real thing:
exec /usr/bin/grep "$@"
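The CI test can then read the counter file and fail the build if the count is wrong. A sketch of such an assertion step (the helper name and file layout are illustrative, following the example above):

```shell
#!/bin/bash
# assert_calls: fail unless the counter file written by the mock
# records the expected number of invocations.
# Usage: assert_calls bin/grep.log 1
assert_calls() {
  local logfile="$1" expected="$2"
  local actual
  actual=$(cat "$logfile" 2>/dev/null || echo 0)   # missing file => 0 calls
  if [ "${actual:-0}" -ne "$expected" ]; then
    echo "FAIL: $logfile records ${actual:-0} call(s), expected $expected" >&2
    return 1
  fi
  echo "OK: $logfile records $expected call(s)"
}
```

A CI job could run the script under test and then call `assert_calls bin/grep.log 1`, letting the non-zero exit status fail the build.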
Petr Skocik
  • This is what I'm talking about precisely, and it's very similar to how I thought of implementing it. I'm still wondering if maybe there's a "spy" cmdline utility that does all of this automatically. I'll probably write one if there isn't, because I wouldn't want to do this manually each time... – nathan g Jun 16 '15 at 10:52
  • Of course you shouldn't do it manually each time. The above solution is perfectly scriptable. – Petr Skocik Jun 16 '15 at 10:56