
I have a requirement where I have to pass a filter condition as a parameter to a bash shell script,

eg "col IN ('abc')"

This works fine when I pass it directly via spark-submit. Now I want to create a bash shell script and pass the condition through it, like:

./shell.sh "col IN ('abc')"

The problem I'm facing with the shell script approach is that even though I pass the param in double quotes, bash-4.1 still splits it into multiple params, which in turn breaks my Spark code.
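For reference, a stripped-down sketch of the wrapper (the real spark-submit options are omitted; the class and jar names are placeholders, not my real ones):

```
#!/bin/bash
# com.example.Main and myapp.jar are placeholders for the real class/jar
spark-submit --class com.example.Main myapp.jar $@
```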

I know I can backslash-escape the param quotes in the shell script and feed it to eval to make it work, because the call below worked for me. The issue is that I have to hand this one-line script to a non-programming person (he only knows basic SQL), and I don't want to confuse him with managing backslashes every time he runs the program.

./shell.sh "\"col IN ('abc')\""

I also tried a lot of manipulation on the param and then setting it again via

set -- param

But again bash splits it up into multiple params; a minimal demo of that splitting is below.
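(The variable name `param` here is just for illustration.)

```
param="col IN ('abc')"
set -- $param   # unquoted: bash re-splits the value on whitespace
echo $#         # prints 3, not 1
```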

Is there an easy way to fix this in the shell script? Please help.

username89
  • `for i in $*` is innately, irresolvably buggy. It serves no useful purpose and should never be run by anyone. – Charles Duffy Jul 20 '21 at 22:43
  • If you want to list your arguments, one per line, that's `printf '%s\n' "$@"`, or `for i in "$@"; do echo "$i"; done` – Charles Duffy Jul 20 '21 at 22:43
  • 1
  • ...that is to say: `"$*"` takes all your arguments, and mashes them together into a single string. Then when you use `$*` _unquoted_, that string gets split out into individual words again. Thus, how `for i in $*` behaves tells you **nothing at all** about whether you originally had a single argument or three separate ones, because you get the same undesired behavior either way. – Charles Duffy Jul 20 '21 at 22:45
  • ...to be clear, you may well have a real problem, but the [mre] given doesn't demonstrate it, because it would show the same output even if your parameter-passing were 100% perfect. – Charles Duffy Jul 20 '21 at 22:47
  • (```shell.sh "\"col IN ('abc')\""``` **shouldn't** fix anything, because the inner `"`s are literal; for them to be syntactic something needs to be `eval`ing your data, or passing it in a context where it's treated as code to an inner shell, or doing something else comparably nasty; but you aren't showing us the code where any of those things happens, so we can't help you fix it; the problem you _are_ giving us enough information to fix is the problem in the test case itself, caused by using `$*` instead of `"$@"`). – Charles Duffy Jul 20 '21 at 22:48
  • Sorry, I wasn't clear in my question. You're right: `"$@"` prints the param on one line, whitespace and all. Actually, `shell.sh "\"col IN ('abc')\""` is what I tried via eval, feeding it into eval. The thing is, `$@` prints the param on one line with no issues, but I pass the same `$@` to spark-submit as well, and when I print the args array received inside my Spark program, "col IN ('abc')" is split into multiple params. Sorry if I'm still not clear; I'm also not sure why it's happening. – username89 Jul 21 '21 at 00:31
  • I have edited my question, since `$@` works fine and prints whitespace args on one line inside the shell script. But the same `$@` fed to spark-submit splits the whitespace param into multiple params, causing an issue in my program. Please help. – username89 Jul 21 '21 at 00:39
  • I resolved it: I was passing `$@` to spark-submit, which split the params. After passing `"$@"` instead, the whitespace param arrived as one single param. Whew, very weird, but solved. Thanks a lot @CharlesDuffy for the help :) Really appreciated :) – username89 Jul 21 '21 at 00:57
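To spell out the fix from the comments: inside the wrapper, the arguments must be forwarded to spark-submit quoted as `"$@"`, which passes each original argument through unchanged. A corrected sketch (same placeholder class/jar names as above):

```
#!/bin/bash
# "$@" (quoted) forwards each original argument as-is;
# unquoted $@ re-splits arguments on whitespace
spark-submit --class com.example.Main myapp.jar "$@"
```

With that, `./shell.sh "col IN ('abc')"` arrives in the Spark program as a single argument.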

0 Answers