27

What I'd like to do is take, as an input to a function, a line that may include quotes (single or double) and echo that line exactly as it was provided to the function. For instance:

function doit {
   printf "%s " ${@} 
   eval "${@}"
   printf " # [%3d]\n" ${?}
}

Which, given the following input

doit VAR=42
doit echo 'single quote $VAR'
doit echo "double quote $VAR"

Yields the following:

VAR=42  # [  0]
echo single quote $VAR  # [  0]
echo double quote 42  # [  0]

So the semantics of the variable expansion are preserved as I'd expect, but I cannot get the exact format of the line as it was provided to the function. What I'd like is for `doit echo 'single quote $VAR'` to result in `echo 'single quote $VAR'`.

I'm sure this has to do with bash processing the arguments before they are passed to the function; I'm just looking for a way around that (if possible).

Edit

So what I had intended was to shadow the execution of a script, producing an exact replica of each executed command along with its exit status, to be used as a diagnostic tool.

While I can get the desired behavior described above by doing something like

while read line ; do 
   doit ${line}
done < ${INPUT}

That approach fails in the face of control structures (i.e. if, while, etc.). I thought about using set -x, but that has its limitations as well: double quotes are shown as single quotes in the trace, and the exit status is not visible for commands that fail.

Rob Bednark
ezpz
  • Related: [How do I use a Bash variable (string) containing quotes in a command?](https://superuser.com/q/360966/11574) –  May 30 '18 at 16:19
  • Another related: [Preserve Quotes in bash arguments](https://stackoverflow.com/questions/10835933/preserve-quotes-in-bash-arguments) – Helder Pereira Mar 24 '20 at 21:06
  • Because `eval` combines all its arguments into a single string before parsing them, `eval "$@"` is _exactly_ the same as `eval "$*"`, just more misleading (insofar as a reader could incorrectly infer that the argument boundaries are preserved). – Charles Duffy Aug 12 '22 at 12:55

8 Answers

11

I was in a similar position to you in that I needed a script to wrap around an existing command and pass arguments preserving quoting.

I came up with something that doesn't preserve the command line exactly as typed but does pass the arguments correctly and show you what they were.

Here's my script set up to shadow ls:

CMD=ls
PARAMS=""

for PARAM in "$@"
do
  PARAMS="${PARAMS} \"${PARAM}\""
done

echo Running: ${CMD} ${PARAMS}
bash -c "${CMD} ${PARAMS}"
echo Exit Code: $?

And this is some sample output:

$ ./shadow.sh missing-file "not a file"
Running: ls "missing-file" "not a file"
ls: missing-file: No such file or directory
ls: not a file: No such file or directory
Exit Code: 1

So as you can see it adds quotes which weren't originally there, but it does preserve arguments with spaces in them, which is what I needed.

David Webb
  • 4
    This may be OK in superficial situations, but you really need to [use arrays for collecting parameters](http://mywiki.wooledge.org/BashFAQ/050). – glenn jackman Jul 30 '13 at 00:43
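
Following up on that comment, here is a rough sketch of the same wrapper rebuilt around a bash array (the variable names are only illustrative); printf %q is used to show each argument in a reusable form instead of manually wrapping it in double quotes:

#!/bin/bash
# Sketch only: same shadow idea as above, but the arguments are kept in an
# array so values with spaces (or quotes) survive without re-parsing a string.
CMD=ls
ARGS=( "$@" )                 # each argument stays its own array element

printf 'Running: %s' "$CMD"
printf ' %q' "${ARGS[@]}"     # %q prints each argument in a re-usable, eval-safe form
printf '\n'

"$CMD" "${ARGS[@]}"           # run directly; no bash -c or eval needed
echo "Exit Code: $?"

With the same invocation as above (./shadow.sh missing-file "not a file") this prints Running: ls missing-file not\ a\ file, i.e. %q escapes rather than adds quotes, but the argument boundaries stay intact.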
7

This happens because bash interprets the arguments, as you thought. The quotes simply aren't there any more when it calls the function, so this isn't possible. It worked in DOS because programs could interpret their command line themselves, not that it helps you!
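
A quick way to see this for yourself is to print each argument the function actually receives; showargs below is just an illustrative helper, not part of the original question:

showargs() { local a; for a in "$@"; do printf '<%s>\n' "$a"; done; }

showargs echo 'single quote $VAR'
# <echo>
# <single quote $VAR>    <- the quotes themselves never reach the function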

Peter Westlake
5

Although @Peter Westlake's answer is correct and there are no quotes to preserve, one can try to deduce whether the quotes were required and thus passed in originally. Personally I used this requote function when I needed proof in my logs that a command had run with the correct quoting:

function requote() {
    local res=""
    for x in "${@}" ; do
        # try to figure out if quoting was required for the $x:
        grep -q "[[:space:]]" <<< "$x" && res="${res} '${x}'" || res="${res} ${x}"
    done
    # remove first space and print:
    sed -e 's/^ //' <<< "${res}"
}

And here is how I use it:

CMD=$(requote "${@}")
# ...
echo "${CMD}"
Chen Levy
  • 1
    This doesn't currently handle arguments with literal single-ticks inside their contents; think about `$'\'"$(rm -rf $HOME)"\''`. Also, `$(foo)` doesn't contain any whitespace, but it still needs to be quoted to be safe. – Charles Duffy May 20 '16 at 16:53
  • 2
    Much better would be to use `printf %q`, which the shell guarantees will generate `eval`-safe output. – Charles Duffy May 20 '16 at 16:55
  • Also, `printf '%s\n' "${res# }"` is a much more efficient way to do the remove-first-space-and-print thing. And `[[ $x = *[[:space:]]* ]]` is a much more efficient way to check for whether a variable contains whitespace. – Charles Duffy May 20 '16 at 17:09
  • I'm still coming up to speed on high-end bash, but doesn't `for x in "${@}" ; do` only iterate once for exactly the reason we've been discussing? Won't it gather all the parameters up into a single string with spaces? I've been working on some functions that pass along parameters (which is why I'm reading this thread). – cycollins Jun 25 '19 at 18:02
  • 1
    [@cycollins](https://stackoverflow.com/users/8611540), in the context of this question `"$@"` is only part of the solution. Each time Bash evaluates some expression, it strips off the outermost quotes. And while `"$@"` preserves the spacing, it can't preserve the quotes themselves, since they were already stripped away. A good example of this can be found in [this SO answer](https://stackoverflow.com/a/12316565/). – Chen Levy Jun 30 '19 at 05:50
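
Picking up the printf %q suggestion from the comments above, a minimal sketch of an alternative (the name requote_q is hypothetical) could look like this:

function requote_q() {
    # %q re-quotes each argument so the result is safe to reuse with eval,
    # including arguments that contain single quotes, whitespace or $(...).
    local out
    out=$(printf '%q ' "${@}")
    printf '%s\n' "${out% }"   # drop the trailing space added by the format
}

CMD=$(requote_q "${@}")
echo "${CMD}"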
4
doit echo "'single quote $VAR'"
doit echo '"double quote $VAR"'

Both will work.

bash will only strip the outside set of quotes when entering the function.

hkf
Peter
2

Bash will remove the quotes when you pass a string containing quotes as a command-line argument. The quotes are simply not there anymore by the time the string is passed to your script; you have no way of knowing whether a single quote or a double quote was used.

What you can probably do is something like this:

doit VAR=42
doit echo \'single quote $VAR\'
doit echo \"double quote $VAR\"

In your script you get

echo 'single quote $VAR'
echo "double quote $VAR"

Or do this

doit VAR=42
doit echo 'single quote $VAR'
doit echo '"double quote $VAR"'

In your script you get

echo single quote $VAR
echo "double quote $VAR"
ttchong
1

This first function:

ponerApostrofes1 () 
{
    for (( i=1; i<=$#; i++ ));
    do
        eval VAR="\${$i}"; 
        echo \'"${VAR}"\';
    done; 
    return; 
}

has problems when the parameters contain apostrophes, for example.

This function:

ponerApostrofes2 () 
{ 
    for ((i=1; i<=$#; i++ ))
    do
        eval PARAM="\${$i}";
        echo -n \'${PARAM//\'/\'\\\'\'}\'' ';
    done;
    return
}

solves the mentioned problem: you can use parameters with apostrophes inside, like "Porky's", and it apparently(?) returns the same string of parameters when each parameter is quoted; if not, it quotes it. Surprisingly, and I don't understand why, if you use it recursively it doesn't return the same list: each parameter gets quoted again. But if you echo each one you recover the original parameter.

Example:

$ ponerApostrofes2 'aa aaa' 'bbbb b' 'c' 
'aa aaa' 'bbbb b' 'c'

$ ponerApostrofes2 $(ponerApostrofes2 'aa aaa' 'bbbb b' 'c' )
''\''aa' 'aaa'\''' ''\''bbbb' 'b'\''' ''\''c'\''' 

And:

$ echo ''\''bbbb' 'b'\'''
'bbbb b'
$ echo ''\''aa' 'aaa'\'''
'aa aaa'
$ echo ''\''c'\''' 
'c'

And this one:

ponerApostrofes3 () 
{ 
    for ((i=1; i<=$#; i++ ))
    do
        eval PARAM="\${$i}";
        echo -n ${PARAM//\'/\'\\\'\'} ' ';
    done;
    return
}

which returns one level of quoting less, doesn't work either, nor does alternating the two recursively.

Chen Levy
1

If one's shell does not support pattern substitution, i.e. ${param/pattern/string}, then the following sed expression can be used to safely quote any string such that it will eval into a single parameter again:

sed "s/'/'\\\\''/g;1s/^/'/;\$s/\$/'/"

Combining this with printf it is possible to write a little function that will take any list of strings produced by filename expansion or "$@" and turn it into something that can be safely passed to eval to expand it into arguments for another command while safely preserving parameter separation.

# Usage: quotedlist=$(shell_quote args...)
#
# e.g.:  quotedlist=$(shell_quote *.pdf)    # filenames with spaces
#
# or:    quotedlist=$(shell_quote "$@")
#
# After building up a quoted list, use it by evaling it inside
# double quotes, like this:
#
#   eval "set -- $quotedlist"
#   for str in "$@"; do
#       # fiddle "${str}"
#   done
#
# or like this:
#
#   eval "\$a_command $quotedlist \$another_parameter"
#
shell_quote()
{
    local result=''
    local arg
    for arg in "$@" ; do

        # Append a space to our result, if necessary
        #
        result=${result}${result:+ }

        # Convert each embedded ' to \' , then insert ' at the
        # beginning of the line, and append ' at the end of
        # the line.
        #
        result=${result}$(printf "%s\n" "$arg" | \
            sed -e "s/'/'\\\\''/g" -e "1s/^/'/" -e "\$s/\$/'/")
    done

    # use printf(1) instead of echo to avoid weird "echo"
    # implementations.
    #
    printf "%s\n" "$result"
}

It may be easier (and maybe safer, i.e. avoiding eval) in some situations to use an "impossible" character as the field separator and then use IFS to control expansion of the value again.
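
A rough bash sketch of that alternative, assuming the chosen separator byte (ASCII 0x1F here) and newlines never occur inside the arguments; the pack/unpack names are only illustrative:

US=$(printf '\037')      # "unit separator" control character

# Join the arguments into one string using the separator.
pack()
{
    local IFS="$US"
    printf '%s' "$*"     # "$*" joins on the first character of IFS
}

# Split such a string back into separate words, without eval.
unpack()
{
    local -a words
    IFS="$US" read -r -a words <<< "$1"
    printf '<%s>\n' "${words[@]}"
}

packed=$(pack "first arg" "second arg with  spaces")
unpack "$packed"
# <first arg>
# <second arg with  spaces>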

Greg A. Woods
0

The shell is going to interpret the quotes and the $ before it passes it to your function. There's not a lot your function can do to get the special characters back, because it has no way of knowing (in the double-quote example) whether 42 was hard-coded or if it came from a variable. You will have to escape the special characters if you want them to survive long enough to make it to your function.
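
For completeness, a sketch of what the function itself can do instead (this is not the original doit, just an illustration): since the original quoting is gone, re-quote whatever arrived using printf %q, which yields an equivalent, though not character-identical, command line:

function doit {
   local line
   line=$(printf '%q ' "${@}")   # re-quote each argument in a reusable form
   printf '%s\n' "${line% }"     # show a copy-pasteable version of the command
   eval "${line}"                # eval keeps plain assignments like VAR=42 working
   printf " # [%3d]\n" ${?}
}

doit VAR=42
doit echo 'single quote $VAR'

The second call prints echo single\ quote\ \$VAR rather than the single-quoted form that was typed, but pasting that line back into a shell runs exactly the same command.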

bta