I have a requirement where I have to pass a filter condition as a parameter to a bash shell script, e.g.
"col IN ('abc')"
This works fine when I pass it directly via spark-submit. I want to create a bash shell script and pass it via the script like this:
./shell.sh "col IN ('abc')"
Now, the problem I am facing with the shell script approach above is that even though I pass the param in double quotes, bash-4.1 still splits it into multiple params, which then breaks my Spark code.
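For reference, here is a minimal sketch of such a wrapper (the class and jar names are placeholders, not my actual job); with the param forwarded unquoted, bash re-splits it:

#!/bin/bash
# shell.sh -- forwards the filter condition to spark-submit
# $1 is expanded unquoted here, so bash re-splits it on whitespace:
# "col IN ('abc')" arrives as three words: col, IN, ('abc')
spark-submit --class com.example.MyApp myapp.jar $1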
I know I can backslash the quotes around the param in the shell script, feed it to eval, and make it work, because the call below worked for me. The issue is that I have to hand this one-line script to a non-programming person (he knows only basic SQL), and I don't want to confuse him with managing backslashes every time he runs the program:
./shell.sh "\"col IN ('abc')\""
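Inside the script, that eval workaround looks roughly like this (again, the spark-submit arguments are placeholders):

#!/bin/bash
# shell.sh (eval workaround) -- relies on the caller escaping the inner quotes
# eval re-parses the whole command line, so the escaped quotes inside $1
# become syntactic quotes and the filter survives as a single argument
eval spark-submit --class com.example.MyApp myapp.jar $1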
I tried a lot of manipulation on the param as well, and then setting the positional parameters again via
set -- $param
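In context, that attempt looked roughly like the following (the variable name is just illustrative):

#!/bin/bash
# shell.sh (failed attempt) -- re-set the positional parameters from a variable
param=$1
set -- $param        # unquoted expansion: bash word-splits the value again
echo "$#"            # prints 3 for "col IN ('abc')" instead of 1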
But again bash splits it up into multiple params. Is there an easy way to fix this in the shell script?
Please help.