0

I have the following command:

"find . -type f -regextype posix-extended -regex './ctrf.*|./rbc.*' -exec basename {} ;"

I want to execute it from a shell script, so I am storing the command in a variable, like this:

Find_Command=$1

and executing it with:

Files="$(${Find_Command})"

It is not working.

Stark
  • What is happening instead of working? Do you get an error message? – Guss Feb 11 '23 at 06:55
  • No Error and no result. – Stark Feb 11 '23 at 06:57
  • I am passing the command line arguments with java to the script which is running on remote server. – Stark Feb 11 '23 at 07:07
  • 2
    See [BashFAQ #50](http://mywiki.wooledge.org/BashFAQ/050), and specifically **5. I'm constructing a command based on information that is only known at run time** and see [**BashFAQ #48: Eval command and security issues**](http://mywiki.wooledge.org/BashFAQ/048) – David C. Rankin Feb 11 '23 at 08:59
  • @DavidC.Rankin is there any better way than eval to use this? – Stark Feb 11 '23 at 09:37
  • 1
    Why do you need to pass an arbitrary shell command to your script in the first place? If your script is supposed to find files, why should I be allowed to specify `rm -rf` (as an extreme example) as an argument? Define a proper interface that lets me specify the *information* you need to find the necessary files, instead of making me construct such a command. – chepner Feb 11 '23 at 13:04
  • 1
    You can't store a list of filenames in a string at all, because there's no character that can exist in a string-type variable and not in a UNIX path. Even newlines are legal in UNIX filenames. And converting something that's valid as a C string to an array has the same problems. – Charles Duffy Feb 11 '23 at 13:52
  • To do this safely, you do something like `mapfile -d '' Files < <("${Find_Command[@]}")` where your find command is an array, and the action it performs is `-print0` – Charles Duffy Feb 11 '23 at 13:53
  • 1
    BTW, what's the point of the regex? It might be a lot more efficient, depending on what you're intending to accomplish, to use something like `find . -mindepth 1 '(' -type d -name 'ctrf.*' -o -name 'rbc.*' -o -prune ')' -printf '%f\n'` -- though I can't test that because I don't have samples of the paths you do and don't want to find. (`-prune` tells find to stop recursing down subtrees altogether, so it's much more efficient than recursing through a tree and then filtering out all the results that came from following that path). – Charles Duffy Feb 11 '23 at 13:58

4 Answers

1

Best Practice: Accept An Array, Not A String

First, your shell script should take the command to run as a series of separate arguments, not a single argument.

#!/usr/bin/env bash

readarray -d '' Files < <("$@")
echo "Found ${#Files[@]} files" >&2
printf ' - %q\n' "${Files[@]}"

called as:

./yourscript find . -type f -regextype posix-extended -regex './ctrf.*|./rbc.*' -printf '%f\0'

Note that there's no reason to use the external basename command: find's -printf action can print just the filename directly.
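
For example, here is a quick sketch using the regex from the question (substitute whatever pattern you actually need); both commands print only the final path component, but the second avoids forking one basename process per match:

# one external basename process per matching file
find . -type f -regextype posix-extended -regex './ctrf.*|./rbc.*' -exec basename {} \;

# find prints the final path component itself via -printf '%f'
find . -type f -regextype posix-extended -regex './ctrf.*|./rbc.*' -printf '%f\n'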


Fallback: Parsing A String To An Array Correctly

If you must accept a string, you can use the answers in Reading quoted/escaped arguments correctly from a string to convert that string to an array safely.

Compromising complete shell compatibility to avoid needing nonstandard tools, we can use xargs:

#!/usr/bin/env bash

readarray -d '' Command_Arr < <(xargs printf '%s\0' <<<"$1")
readarray -d '' Files < <("${Command_Arr[@]}")
echo "Found ${#Files[@]} files" >&2
printf ' - %q\n' "${Files[@]}"

...with your script called as:

./yourscript $'find . -type f -regextype posix-extended -regex \'./ctrf.*|./rbc.*\' -printf \'%f\\0\''
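
If you also want to reject a malformed command string instead of silently getting no results (see the comments below about mismatched quotes), one sketch is to run the xargs parse as a separate step so its exit status can be checked; this assumes GNU xargs, which reports an error on an unmatched quote:

#!/usr/bin/env bash

tmp=$(mktemp) || exit 1
trap 'rm -f "$tmp"' EXIT

# parse the single-string argument into NUL-delimited words first,
# so a parse failure can stop the script before anything is executed
if ! xargs printf '%s\0' <<<"$1" >"$tmp"; then
    echo "Could not parse the command string" >&2
    exit 1
fi

readarray -d '' Command_Arr <"$tmp"
readarray -d '' Files < <("${Command_Arr[@]}")
echo "Found ${#Files[@]} files" >&2
printf ' - %q\n' "${Files[@]}"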
Charles Duffy
  • This won't work as he can only pass a single sh-c-argument like argument. – konsolebox Feb 11 '23 at 14:57
  • That's what he _asked how to_ pass, but I don't see anywhere in the question that's specified as the only available option, vs the only option the OP knew to exist. – Charles Duffy Feb 11 '23 at 15:21
  • He hinted with "I am passing the command line arguments with java to the script which is running on remote server.". – konsolebox Feb 11 '23 at 22:33
  • You can't "convert that string to an array safely.". Solutions like `compgen -W` and `xargs` still give a result even with syntax errors like unmatched quotes. `eval` is still the one most appropriate for this and converting a command to an array of arguments before executing it doesn't really give a safety benefit. This suggestion should also mention that it will only parse simple commands. Commands having redirections will not work as expected. Allowing commands to execute in an unexpected manner is also as destructive. A few "at least" arguments will not justify it. – konsolebox Feb 11 '23 at 23:07
  • @konsolebox, xargs exits with a nonzero status if given input with mismatched quotes, though granted there's a bit of work we need to do to detect that from outside the process substitution. And yes, only handling simple commands is an intentional part of the point; reduces the set of Things That Happen Behind Your Back. – Charles Duffy Feb 12 '23 at 12:59
  • Except this doesn't happen behind OP's back since he literally is the one giving the command. The argument splitting is a pointless religious ritual. – konsolebox Feb 12 '23 at 15:17
  • I'm curious though, what exactly is the unexpected behavior that may happen in an eval operation that an array-splitting method can mitigate? Mind that execution happens in a subshell, so worrying about syntax effects on the grounds that eval allows outer constructs to be affected is pointless. – konsolebox Feb 12 '23 at 15:33
  • There are many valid use case for argument splitting like in completion functions but this simply isn't one. If your reason is to just avoid eval no matter the case, that's not being objective and simply a form of fearing the unknown when eval's behavior is simple and completely predictable. – konsolebox Feb 12 '23 at 15:37
  • @konsolebox, eval's behavior is "simple and completely predictable" when there's no parameterization, but that's rarely true in the real world. We write `"find . -type d ..."` in an answer, and then someone actually _deploying_ that answer writes `"find $topdir -type d ..."` and puts themselves in a world of hurt. Splitting into an array only _mitigates_ that hurt (whereas avoiding it outright would require the never-a-string approach), but it's better than subjecting themselves to the effects of `for topdir in /uploads/*/; do ...` when some bad actor might have created `/uploads/$(rm -rf ~)`. – Charles Duffy Feb 13 '23 at 21:38
  • You are giving an example that is out of context and a possibly bad use of eval. Are you supposed to write `eval "do_something $topdir"` when it should be `eval "do_something \"\${topdir}\""` or `eval "do_something ${topdir@Q}"`? Then that's a user problem and not actually eval's. Eval is supposed to execute a raw command expressed by the expanded word, plain and simple. Why is that supposed to be difficult to keep in mind? Yet instead of telling people that, you tell them to avoid eval like a religion. – konsolebox Feb 13 '23 at 23:15
  • And basically your argument is that people might misuse eval, rather than what the proper solution to the OP's problem is. – konsolebox Feb 13 '23 at 23:19
0

If you want to run a command stored in a variable and save its output in another variable, you can use the following:

command="find something"
output=$($command)

Or, if you want to store the output in an array:

typeset -a output=($($command))

However, storing filenames in variables and then attempting to access the files by those names is a bad idea, because there is no delimiter you can safely split on: filenames can contain any character except NUL (see https://mywiki.wooledge.org/BashPitfalls).
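
As a minimal sketch of what NUL-delimited handling looks like in bash 4.4+ (the find pattern here is only illustrative), the names can instead be read into an array with no delimiter ambiguity at all:

# -print0 terminates each path with a NUL byte; mapfile -d '' splits on NUL
mapfile -d '' files < <(find . -type f -name 'ctrf.*' -print0)
printf 'found: %q\n' "${files[@]}"

Each array element then holds one complete filename, whatever spaces or newlines it contains.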

I'm not sure what you're trying to accomplish, but your find command contains an error: the -exec action must be terminated with \; (an escaped semicolon) so that find, not the shell, receives the ;. Aside from that, this looks like 'the XY problem' (see https://xyproblem.info/). If you want the basenames of regular files with the extension .ctrf or .rbc, use the bash script below:

for x in **/*.+(ctrf|rbc); do basename "$x"; done

Or the zsh script:

basename -a **/*.(ctrf|rbc)(#q.)

Make sure the 'extended glob' option is enabled in your shell. To enable it in bash, run the following command (globstar is also needed for ** to recurse into subdirectories):

shopt -s extglob globstar

And for zsh:

setopt extendedglob

-1

You should use an array instead of a string for Find_Command:

#!/usr/bin/env bash

Find_Command=(find . -type f -regextype posix-extended -regex '(./ctrf.*|./rbc.*)' -exec basename {} \;)

Files=($("${Find_Command[@]}"))

The second statement assumes you don't have special characters (like spaces) in your file names.
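
If file names may contain whitespace, a sketch of a safer variant (assuming GNU find and bash 4.4+, echoing the mapfile -d '' suggestion in the comments under the question) is to replace -exec basename with -printf '%f\0' and read the NUL-delimited output into the array:

Find_Command=(find . -type f -regextype posix-extended -regex '(./ctrf.*|./rbc.*)' -printf '%f\0')
# NUL cannot appear in a filename, so it is a safe separator
mapfile -d '' Files < <("${Find_Command[@]}")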

Philippe
  • Missing quotes. Should be `Files=( $("${Find_Command[@]}") )` – Charles Duffy Feb 11 '23 at 13:50
  • However, even that is bad practice; see [BashPitfalls #50](https://mywiki.wooledge.org/BashPitfalls#hosts.3D.28_.24.28aws_.2BICY.29_.29). – Charles Duffy Feb 11 '23 at 13:51
  • (Right now, this is word-splitting the command because of the lack of quotes, and then -- because there's no command substitution -- storing _the command itself_, not its output, in the `Files` array) – Charles Duffy Feb 11 '23 at 14:02
-2

Use eval:

Files=$(eval "${Find_Command}")

Be mindful of keeping the parameter sanitized and secure.
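
For instance, a minimal sketch of a full script built on this (assuming the command prints one name per line and no name contains a newline; File_List is just an illustrative variable name):

#!/usr/bin/env bash

Find_Command=$1                    # the complete command, passed as a single string
Files=$(eval "${Find_Command}")    # run the command text exactly as given
mapfile -t File_List <<<"$Files"   # split the captured output on newlines
printf 'found: %s\n' "${File_List[@]}"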

konsolebox