
I have a bash script with the following function:

function curl_the_URL_and_check_status(){

   status=$(curl $1 | grep "X-Cache-Status:" | cut -d " " -f 2)

   if [[ "$status" != *"MISS"* ]]; then
       echo "
   cURL returned non MISS status. Something is not correct. Exiting with status 1
   check the status by entering curl $1"
       exit 1
   fi
}

Passing the parameters:

## My comment :: .... some more output ....
+++ grep -o -P '(?<=\/\/).*?(?=\/)'
++ host=php-mindaugasb.c9.io
+++ echo http://php-mindaugasb.c9.io/Testing/JS/displayName.js
+++ perl -pe 's|(?<=://).+?(?=/)|localhost:805|'
++ modified_URL=http://localhost:805/Testing/JS/displayName.js

## My comment ::  below is the parameter passed to the function as $1
++ cURL_string='-H "Host: php-mindaugasb.c9.io" -I -s http://localhost:805/Testing/JS/displayName.js'

I get this passed to curl:

++ echo -H '"Host:' 'php-mindaugasb.c9.io"' -I -s http://localhost:805/Testing/JS/displayName.js

Which does not work when tried from the console (a gateway timeout error is thrown).

So my curl looks like this:

curl -H '"Host:' 'php-mindaugasb.c9.io"' -I -s http://localhost:805/Testing/JS/displayName.js

I need it to look like this (which works when tested from console):

curl -H "Host: php-mindaugasb.c9.io" -I -s http://localhost:805/Testing/JS/displayName.js

How do I make this happen?

Tried with `curl $1`, `curl "$1"` ...

Thanks

ADDENDUM

I call the function like this:

 # another function that constructs correct CURL string
 cURL_string="\"-H Host: $host\" -I -s $modified_URL"

 # global scope - this is where the curl is called
 curl_params=$(get_prepared_string_for_cURL $1)
 curl_the_URL_and_check_status $curl_params

(UPDATE: 14 01 2015)

Here is what I get using the array approach:

cURL_string=(-H \"Host: $host\" -I -s $modified_URL)

CASES:

curl "${curl_params[@]}" ==> curl '-H "Host: php-mindaugasb.c9.io" -I -s http://localhost:805/Testing/JS/displayName.js'

curl: no URL specified!

curl ${curl_params[@]} ==> curl -H '"Host:' 'php-mindaugasb.c9.io"' -I -s http://localhost:805/Testing/JS/displayName.js

I need

curl -H "Host: php-mindaugasb.c9.io" -I -s http://localhost:805/Testing/JS/displayName.js

get_prepared_string_for_cURL

function get_prepared_string_for_cURL(){

    # get the host from URL, to use with in curl with the --Host flag
    host=$(echo $1 | grep -o -P '(?<=\/\/).*?(?=\/)')

    # replace the host part with the "localhost:805" to request the resource
    # from the nginx virtual host (server block) dedicated for proxy cache
    modified_URL=$(echo $1 | perl -pe 's|(?<=://).+?(?=/)|localhost:805|')

    # construct cURL string
    cURL_string=(-H Host: $host -I -s $modified_URL)

    # echo "$cURL_string"

    echo "${cURL_string[@]}"
}
Mindaugas Bernatavičius

2 Answers


The shell parses quotes before substituting variable references (e.g. $1), so if there are quotes in the value of $1, by the time they're in place it's too late for them to do anything useful. Rather than passing the curl arguments as a single argument with quotes embedded, pass it as a series of arguments and use "$@" to expand it:

function curl_the_URL_and_check_status(){

   status=$(curl "$@" | grep "X-Cache-Status:" | cut -d " " -f 2)
[...]

...and then call it with something like:

curl_the_URL_and_check_status -H "Host: php-mindaugasb.c9.io" -I -s http://localhost:805/Testing/JS/displayName.js

instead of:

curl_the_URL_and_check_status '-H "Host: php-mindaugasb.c9.io" -I -s http://localhost:805/Testing/JS/displayName.js'

But it looks like you're also building the parameter list in a variable, which causes exactly the same problem -- there's no good way to take a plain variable and split it into arguments based on embedded quotes. Again, there's a solution: use an array, with each argument being an element of the array. Then, reference the array as "${arrayname[@]}" so each element gets treated as a separate argument.

cURL_args=(-H "Host: php-mindaugasb.c9.io" -I -s "$modified_URL")
curl_the_URL_and_check_status "${cURL_args[@]}"
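To see why the string form fails and the array form works, here is a quick sketch (arguments printed one per line with `printf`; the host value is taken from the question):

```shell
#!/usr/bin/env bash

# String form: the value is split on whitespace and the embedded
# double-quotes stay literal -- they are data, not quoting
args='-H "Host: php-mindaugasb.c9.io" -I -s'
printf '<%s>\n' $args
# <-H>
# <"Host:>
# <php-mindaugasb.c9.io">
# <-I>
# <-s>

# Array form: each element is exactly one argument, spaces included
args_arr=(-H "Host: php-mindaugasb.c9.io" -I -s)
printf '<%s>\n' "${args_arr[@]}"
# <-H>
# <Host: php-mindaugasb.c9.io>
# <-I>
# <-s>
```

The first `printf` shows curl receiving five mangled words; the second shows the four arguments curl actually needs.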
Gordon Davisson
  • I don't know If I understand your answer correctly, but let me take this statement "there's no good way to take a plain variable and split it into arguments based on embedded quotes" - I'm not trying to split a variable that was passed I'm trying to construct curl parameters inside a function, and then pass them to curl from that function using echo :) Please see my update above (UPDATE: 14 01 2015) -- tell me if your answer should handle that as well – Mindaugas Bernatavičius Jan 14 '15 at 13:26
  • Seems like bash adds single quotes (curl '-H Host x -I ') when I do the assignment (and this is no matter what - I can pass an array, or a variable) - this is the statement: "curl_params=$(get_prepared_string_for_cURL $1)" so I simply bypass the assignment by building up the parameters inside a function but creating a string in global scope - is this a good practice or should it be avoided? I know it is less clear (at least it seems). – Mindaugas Bernatavičius Jan 15 '15 at 09:42
  • @MindaugasBernatavičius If I understand your update right, the problem is that you escaped the quotes when creating the array -- use `cURL_string=(-H "Host: $host" -I -s $modified_URL)` instead of `cURL_string=(-H \"Host: $host\" -I -s $modified_URL)`. With the quotes escaped, they're treated as part of the arguments rather than delimiters around an argument. Then use double-quotes when you reference the array (see my last example). – Gordon Davisson Jan 16 '15 at 05:53
  • Oh, and the reason you're seeing single-quotes is that's bash's way of unambiguously displaying the arguments. It's not "really" there, but bash has to add something to indicate that there are double-quotes embedded in some of the arguments (which you didn't want) rather than around them (which you do want). After you fix the problem, the single-quotes will still be there, but they'll be in place of the double-quotes (rather than in addition to them). – Gordon Davisson Jan 16 '15 at 05:56
  • And one final note: you say you're not trying to split a variable, but that's actually exactly what you were doing. Each argument to `curl` is a string, so the arguments to `curl` are a *list of* strings. But a normal shell variable can only store a single string, so in order to turn that into a list of strings (arguments), it has to be split up. And that splitting is where all the mess comes in. An array solves this because it holds... several separate strings. No splitting needed, they're already in basically the right form. – Gordon Davisson Jan 16 '15 at 06:03

You could use `eval` to interpret variables in a command string first and then execute it.

UPDATE:

eval [arg ...]
          The args are read and concatenated together into a single command. This command
          is then read and executed by the shell, and its exit status is returned as the
          value of eval. If there are no args, or only null arguments, eval returns 0.

So you could build a command string like `command="curl -H \"Content-Type: whatever\" $1 $2"` and afterwards run

eval ${command}

`eval` will read and interpret all your quotations, escapes and variables first and then run the interpreted command. Hope this helps. Greets.
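A minimal sketch of what `eval` does here (using `printf` as a stand-in for `curl` so it runs anywhere; the caveat about untrusted data in the comments still applies):

```shell
#!/usr/bin/env bash

# The string holds quotes as data; a plain expansion would pass them literally
cmd='printf "%s\n" "Host: example.com"'

# eval re-parses the string, so the embedded quotes act as real quoting
# and "Host: example.com" becomes a single argument
eval "$cmd"
# Host: example.com

# Caution: eval also expands command substitutions hidden in the data,
# e.g. a value containing $(rm -R *) would actually be executed
```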

Marc Bredt
  • Thanks, but can you add some details please? – Mindaugas Bernatavičius Jan 13 '15 at 17:34
  • Please don't use `eval` unless you know *exactly* what you're doing. It'll interpret *everything* in the string, whether you wanted it interpreted or not. It has a well-deserved reputation for causing nasty bugs... – Gordon Davisson Jan 13 '15 at 17:50
  • @GordonDavisson do you have some more information on those "nasty bugs"? I would be interested just to know exactly what I avoid by not using `eval` anymore. – Marc Bredt Jan 13 '15 at 17:53
  • 1
    @EmilKakkau: The basic problem is that it blurs the line between data and executable code (even further than it usually is in the shell), and can end up executing something you thought was data. How dangerous it is depends on how predictable the command you're executing it; the less control you have over it, the less dangerous it is (but if it's highly predictable, why do you need `eval` anyway?). Suppose, for example, the URL you're going to fetch contains data from an untrusted source -- if the URL contains `$(rm -R *)`, the eval command will execute `rm -R *`! – Gordon Davisson Jan 13 '15 at 18:02
  • @EmilKakkau: Here are some links that discuss it: [BashFAQ #48](http://mywiki.wooledge.org/BashFAQ/048) and [an SO question on the subject](http://stackoverflow.com/questions/17529220/why-should-eval-be-avoided-in-bash-and-what-should-i-use-instead), and [a made-up but extreme example](http://stackoverflow.com/questions/9308606/usr-bin-find-cannot-build-its-arguments-dynamically/9322047#9322047). – Gordon Davisson Jan 13 '15 at 18:07
  • @GordonDavisson OK, I see the point when passing unchecked data, if it is not fully controllable. Here I just wanted to point at the concatenation and execution that was asked for. I rarely remember using eval - for example when setting up some dynamic variables like `foo_$varsuffix=foo` where I knew what `$varsuffix` looks like. Anyway, thanks for putting security aspects in my mind. – Marc Bredt Jan 13 '15 at 18:12
  • @EmilKakkau I wouldn't just consider it a security issue, it's just easiest to explain that way. `eval` tends to create bugs that an attacker can exploit, *and* that normal users will stumble into at random. Mind you, I mostly script OS X, so I've gotten used to the fact that users will put all kinds of random characters in filenames. Not out of malice, just because `John's spreadsheet *#3*!.txt` seemed like a reasonable name at the time. – Gordon Davisson Jan 16 '15 at 06:12
  • Which kind of impact could this filename produce in combination with using eval? At the moment it is hidden to me whether there is one. Could you please give more details about that? – Marc Bredt Jan 16 '15 at 07:23