
How can I execute a shell command in the background from within a bash script, when the command is stored in a string?

For example:

#!/bin/bash
cmd="nohup mycommand";
other_cmd="nohup othercommand";

"$cmd &";
"$other_cmd &";

This does not work -- how can I do this?

4 Answers


Building off of ngoozeff's answer, if you want to make a command run completely in the background (i.e., if you want to hide its output and prevent it from being killed when you close its Terminal window), you can do this instead:

cmd="google-chrome";
"${cmd}" &>/dev/null & disown;
  • &>/dev/null sets the command’s stdout and stderr to /dev/null instead of inheriting them from the parent process.
  • & makes the shell run the command in the background.
  • disown removes the “current” job (the last one stopped or put in the background) from the shell’s job control, so the shell won’t send it a HUP signal on exit.

In some shells you can also use &! instead of & disown; they both have the same effect. Bash doesn’t support &!, though.
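As a quick check (a minimal sketch, not part of the original answer): after disown, the job no longer appears in the shell's job table.

```shell
#!/usr/bin/env bash
# Background a short-lived command, silence its output, and disown it.
sleep 0.2 &>/dev/null & disown

# The shell's job table no longer lists it:
echo "jobs listed: $(jobs | wc -l | tr -d ' ')"   # prints: jobs listed: 0
```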

Also, when a command is stored in a variable, it's more robust to use eval "${cmd}" than plain "${cmd}":

cmd="google-chrome";
eval "${cmd}" &>/dev/null & disown;

If you run this command directly in a Terminal, it shows the PID of the process the command starts; inside a shell script, no output is shown.
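The difference matters as soon as the string contains arguments. A minimal sketch (the printf string below stands in for any real command):

```shell
#!/usr/bin/env bash
# The embedded quotes in this string only work if it is re-parsed by eval.
cmd='printf "%s\n" hello world'

eval "${cmd}"                 # prints "hello" and "world" on separate lines

# Plain "${cmd}" looks for a command literally named: printf "%s\n" hello world
"${cmd}" 2>/dev/null || echo "plain expansion failed"
```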

Here's a function for it:

#!/bin/bash

# Run a command in the background.
_evalBg() {
    eval "$@" &>/dev/null & disown;
}

cmd="google-chrome";
_evalBg "${cmd}";

Also, see: Running bash commands in the background properly

sainaen
GreenRaccoon23
    @user248237dfsf @GreenRaccoon23 `eval "${cmd}"` is surely much better than just `${cmd}`. I believe that's what you meant as `"${cmd}"` will fail for cases like: `cmd='ls -l'`. Also, `${cmd}` itself is not the perfect solution as it will fail for cases using expansions done before parameter expansion. So for example for `cmd='touch file{1..5}'`. – PesaThe Nov 24 '17 at 18:22
  • `eval "$@"` is buggy -- if someone expects `"$@"` to ensure that arguments are passed separately from each other, using `eval` defeats the point (by concatenating all its arguments together into a single string, and then evaluating it). Better to *either* use `"$@"` (expecting arguments to be pre-split), or `eval "$1"` (expecting exactly one argument formatted for parsing). – Charles Duffy Dec 29 '17 at 00:03
  • @CharlesDuffy What's an example of when `eval "$@"` is buggy? I can't think of one. I can think of cases where `"$@"` would be buggy though, like indirect variables. `eval "${1}"` would ignore multiple arguments too, as you pointed out. `eval "$@"` would handle both of those edge cases. I do agree that `eval "${1}"` should be enough though because your code should be consistent in the way it calls that function. Requiring the quotes in `_evalBg "${cmd}"` instead of `_evalBg ${cmd}` will make the code more manageable in the long run. – GreenRaccoon23 Dec 29 '17 at 01:01
  • @GreenRaccoon23, as a demonstrative case, consider `set -- printf '%s\n' "first argument" "second argument"` -- `"$@"` will work on its own, `eval "$@"` won't. – Charles Duffy Dec 29 '17 at 02:45
  • @GreenRaccoon23, ...and yes, I'm positioning ignoring multiple arguments as a feature, not a bug, on the rationale that it's better to not support a case at all than to support it badly: Either accept a single string with code to run (`eval "$1"`), or accept a list of arguments (`"$@"`); but `eval "$@"` is behaving the exact same way as `eval "$*"` would, with all the bugs that implies. – Charles Duffy Dec 29 '17 at 02:47
  • @GreenRaccoon23, (...`for arg; do eval "$arg"; done &` would be reasonable too -- accepting a list of separate shell commands to be individually executed in a single background process -- but "concatenate all our arguments and parse the resulting string" is just error-prone). – Charles Duffy Dec 29 '17 at 04:07

Leave off the quotes:

$cmd &
$othercmd &

eg:

nicholas@nick-win7 /tmp
$ cat test
#!/bin/bash

cmd="ls -la"

$cmd &


nicholas@nick-win7 /tmp
$ ./test

nicholas@nick-win7 /tmp
$ total 6
drwxrwxrwt+ 1 nicholas root    0 2010-09-10 20:44 .
drwxr-xr-x+ 1 nicholas root 4096 2010-09-10 14:40 ..
-rwxrwxrwx  1 nicholas None   35 2010-09-10 20:44 test
-rwxr-xr-x  1 nicholas None   41 2010-09-10 20:43 test~
ngoozeff
  • Thanks. It's necessary to leave off the semicolons too, as far as I can tell. –  Sep 10 '10 at 10:50
    Also make sure you do **not** put quotations around `$cmd`. If you do, it will try to run a command called `"ls -la"` instead of `ls` with the switches `-la`. – GreenRaccoon23 Apr 16 '15 at 16:41
    `cmd='some-command "with arguments"'; $cmd` is not **at all** the same as `some-command "with arguments"`. It'll fail if you have quoting in your argument list; it'll fail if you need to control when globs are and are not expanded; it'll fail if running code that depends on brace expansion or parameter expansion at execution time. See [BashFAQ #50](http://mywiki.wooledge.org/BashFAQ/050): *I'm trying to put a command in a variable, but complex cases always fail!*. – Charles Duffy Dec 29 '17 at 00:00
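The usual fix that BashFAQ #50 recommends is to store the command as an array rather than a string. A sketch of that approach (not part of the original answer):

```shell
#!/usr/bin/env bash
# Each array element stays a separate word, so quoted arguments survive.
cmd=(printf '%s\n' "first argument" "second argument")

# Runs exactly as if typed directly:
"${cmd[@]}"

# Backgrounding works the same way:
"${cmd[@]}" &>/dev/null &
wait
```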

This works because it's a static variable. You could do something much cooler like this:

filename="filename"
extension="txt"
for i in {1..20}; do
    eval "filename${i}=${filename}${i}.${extension}"
    var="filename${i}"
    touch "${!var}"
    echo "this rox" > "${!var}"
done

This code will create 20 files and dynamically set 20 variables. Of course you could use an array, but I'm just showing the feature :). Note that you can use the variables $filename1, $filename2, $filename3... because they were created with the eval command. In this case I'm just creating files, but you could use this to dynamically build arguments to commands and then execute them in the background.
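For comparison, the array form this answer alludes to might look like the sketch below (the file names are assumptions carried over from the loop above):

```shell
#!/usr/bin/env bash
# Build the twenty names in an array instead of twenty variables.
extension="txt"
filenames=()
for i in {1..20}; do
    filenames+=("filename${i}.${extension}")
done

# Indexing starts at 0, so element 4 is the fifth name.
echo "count: ${#filenames[@]}, fifth: ${filenames[4]}"   # prints: count: 20, fifth: filename5.txt
```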

jyz

For example, if you have a startup script named run.sh, use the following command line to start it working in the background:

./run.sh &>/dev/null &
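A self-contained sketch of the same idea (run_demo.sh below is a hypothetical stand-in for your run.sh):

```shell
#!/usr/bin/env bash
# Create a tiny stand-in for run.sh in a temporary directory.
demo="$(mktemp -d)/run_demo.sh"
printf '#!/bin/bash\necho started\n' > "$demo"
chmod +x "$demo"

# Start it in the background with all output silenced, as in the answer:
"$demo" &>/dev/null &
wait    # only so this sketch can report a result; normally you would just carry on
echo "background job finished with status $?"
```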

jokermt235