
I'm stuck with a bash script which should write both to stdout and into a file. I'm using functions and some variables inside them. Whenever I redirect the function to a file and print on the screen with tee, I can't use the variables that I set in the function -- they become local somehow. Here is a simple example:

#!/bin/bash
LOGV=/root/log

function var()
{
    echo -e "Please, insert VAR value:\n"
    read -re VAR
}
var 2>&1 | tee $LOGV
echo "This is VAR:$VAR"

Output:

[root@testbox ~]# ./var.sh   
Please, insert VAR value:

foo
This is VAR:
[root@testbox ~]#

Thanks in advance!

EDIT: Responding to @Etan Reisner's suggestion to use var 2>&1 > >(tee $LOGV)

The only problem with this construction is that the log file doesn't receive everything...

[root@testbox~]# ./var.sh
Please, insert VAR value: 

foo 
This is VAR:foo
[root@testbox ~]# cat log 
Please, insert VAR value:
    I think your problem is that since you're using a pipe (into `tee`), the initial invocation of `var` is happening in a subshell. So you set an environment variable in that subprocess, but doesn't affect the environment of the main (parent) process. – Steve Summit Jul 21 '15 at 23:09
  • Do you want the prompt "Please, insert..." to go to the output file? – William Pursell Jul 21 '15 at 23:15
  • Thanks, yes I guess this is the subshell issue, but can't fix it yet. William Pursell, yes, I do! – LinenG Jul 21 '15 at 23:19
    What is your ultimate goal here? You need to avoid variable assignment in sub-shells if you want them to affect the main shell. If the post snippet is your ultimate goal then your only solution (as far as I know) is to replace the pipe with something like `> >(tee $LOGV)` to use process substitution and output redirection instead of the pipe. – Etan Reisner Jul 21 '15 at 23:34
  • Amazing, Etan! That's what I need, yes, the post snippet. It works like a charm. Please post a full answer so I can accept it! Thanks a lot, I've spent a few hours on this ... – LinenG Jul 21 '15 at 23:43
  • As an aside -- `tee $LOGV` is not safe: If `LOGV` was a filename with spaces, `tee` would then write to more than one file. Always quote your expansions: `tee "$LOGV"`. Also, all-caps variable names are reserved by convention for system use; try to be in the habit of using at least one lower-case letter in your own variables' names, to prevent overriding a system variable by mistake. (See fourth paragraph of http://pubs.opengroup.org/onlinepubs/009695399/basedefs/xbd_chap08.html, keeping in mind that shell and environment variables share a namespace). – Charles Duffy Jul 22 '15 at 01:25
  • @Charles Duffy, thank you for recommendations, will keep in mind! – LinenG Jul 22 '15 at 12:57
  • Re: "log doesn't receive everything" -- it doesn't receive local echo of what the user typed, because that's *local echo* -- it's not on stdout or stderr, but purely a terminal construct. – Charles Duffy Jul 22 '15 at 15:09
  • As a terminology note -- they're not "local" as such; the variables in question are global to the shell they're running in; the problem is that the shell they're running in is a subshell that exits when the pipeline is finished. For this reason, you couldn't just use, say, `declare -g` to force them to be global and have that provide any useful effect. – Charles Duffy Jul 22 '15 at 15:17
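The quoting hazard Charles Duffy points out in the comments above is easy to reproduce. A minimal sketch (the filename is a placeholder; the demo runs in a scratch directory so it leaves no stray files behind):

```shell
#!/bin/bash
cd "$(mktemp -d)"                     # scratch directory to keep the demo isolated
logv="my log"                         # a filename containing a space

echo hello | tee $logv   >/dev/null   # unquoted: tee receives TWO arguments, "my" and "log"
ls                                    # shows two files: log, my

echo hello | tee "$logv" >/dev/null   # quoted: writes one file named "my log"
ls                                    # now also shows "my log"
```

Word splitting happens after the unquoted expansion of `$logv`, which is why `tee` silently writes two files instead of one.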

1 Answer


This is a variant of BashFAQ #24.

var 2>&1 | tee $LOGV

...like any shell pipeline, is allowed to run the function var inside a subprocess -- and, in practice, bash behaves this way. (The POSIX sh specification leaves undefined which pipeline components, if any, run inside the parent shell.)
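A minimal illustration of that subshell behavior (hypothetical variable names):

```shell
#!/bin/bash
var=original
{ var=changed; } | cat   # the left-hand side of a pipe runs in a subshell in bash
echo "var is: $var"      # prints "var is: original" -- the assignment was lost
```

The assignment really does happen, but in a child process whose environment vanishes when the pipeline finishes.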


Avoiding this is as simple as not using a pipeline.

var > >(tee "$LOGV") 2>&1

...uses process substitution (a ksh extension adopted by bash, not present in POSIX sh) to represent the tee subprocess as a filename (of the form /dev/fd/## on modern Linux) to which output can be redirected, without moving the function into a pipeline.
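By contrast, a redirection is not a pipeline, so the command being redirected stays in the parent shell and its assignments survive. A sketch of the same experiment as above, with the pipe swapped for process substitution:

```shell
#!/bin/bash
var=original
{ var=changed; } > >(cat >/dev/null)  # redirection, not a pipe: the braces run in THIS shell
echo "var is: $var"                   # prints "var is: changed"
```

Only `tee`/`cat` runs in a subprocess here; the code whose variables you care about does not.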


If you want to ensure that tee exits before other commands run, use a lock:

#!/bin/bash
logv=/tmp/log

collect_var() {
        echo "value for var:"
        read -re var
}
collect_var > >(logv="$logv" flock "$logv" -c 'exec tee "$logv"') 2>&1
flock "$logv" -c true # wait for tee to exit

echo "This is var: $var"

Incidentally, if you want to run multiple commands with their output being piped in this way, you should invoke the tee only once, and feed into it as appropriate:

#!/bin/bash
logv=/tmp/log
collect_var() { echo "value for var:"; read -re var; }

exec 3> >(logv="$logv" flock "$logv" -c 'exec tee "$logv"') # open output to log
collect_var >&3 2>&3         # run function, sending stdout/stderr to log
echo "This is var: $var" >&3 # ...and optionally run other commands the same way
exec 3>&-                    # close output
flock "$logv" -c true        # ...and wait for tee to finish flushing and exit.
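For the common case where the *entire* script's output should be mirrored, BashFAQ #24 also covers redirecting the script's own stdout/stderr once, up front, with exec. A sketch, with /tmp/demo.log as a placeholder path (the here-string stands in for interactive input):

```shell
#!/bin/bash
logv=/tmp/demo.log
exec > >(tee "$logv") 2>&1       # everything below is duplicated to screen and log

read -re var <<<"foo"            # stdin is untouched, so reads still work
echo "This is var: $var"         # goes to both the terminal and "$logv"
```

Note that this variant has the same tee-may-exit-late caveat that the flock dance above addresses: without a lock, the script can exit before tee has flushed its last lines.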
Charles Duffy
  • Thank you for the reply, but there are some obstacles with the methods you've recommended. With the first variant, the log file doesn't receive the final echo part, and it doesn't append (easily managed with `tee -a`, though). With the second, some unexpected behavior happens: the inserted value foo is printed on the same line as the echo, like this: `fooThis is VAR:foo`; the echo is not redirected to the file; and the script doesn't exit after echoing `fooThis is VAR:foo` -- instead it prompts for input: `[root@testbox ~]# ` – LinenG Jul 22 '15 at 13:19
  • I don't think I follow what you're trying to accomplish well enough for the objections above to make sense. Could you extend your question to describe *exactly* what contents you expect in the log file after the command is run, and *exactly* what behavior or output leads you to believe that the script isn't exiting when it should? (As for (1), your original pipe-based code didn't append to logv either, so I don't follow why it's a problem that this doesn't). – Charles Duffy Jul 22 '15 at 13:40
  • Charles Duffy, the final goal is to have function with defined global variables inside that may be used outside of the function while redirecting stdout/stderr of script execution to file and screen. P.S. You're right, my bad, I didn't mention about appending, sorry. – LinenG Jul 22 '15 at 14:39
  • I've added a full end-to-end example that demonstrates using locking to ensure that `tee` exits before other parts of the command are run. – Charles Duffy Jul 22 '15 at 15:08
  • Charles Duffy, thanks for full example it is much more understandable now. But, I still have one question to ask... What if I'd like to ensure `tee` exits **after** the execution of commands, how should I use this construct (have tried back and forth, can't figure out). Thanks! – LinenG Jul 23 '15 at 09:56
  • "the execution of commands", meaning the execution of those commands that feed into the `tee`? Because that's already guaranteed -- you can't put the `flock` until after the commands that feed into the tee as a matter of course. `tee` won't ever exit until after its stdin feed has been closed; you don't need to worry about it exiting early -- the only case you need to worry about is it exiting *late*. – Charles Duffy Jul 23 '15 at 13:30
  • ...now, if you want "commands", plural, to feed into the `tee`, then you want to use the former exec form. I'll update to show that combined with the locking. – Charles Duffy Jul 23 '15 at 13:32
  • By the way, if your `echo "This is var: $var"` *also* fed into the tee, ie. was in the `>&3` section before the FD was closed with the multi-line form, then you wouldn't have the trouble with it being out-of-sync. – Charles Duffy Jul 23 '15 at 13:36
  • Charles, that was exactly what I was trying to accomplish! Thank you for all your answers and extended examples! – LinenG Jul 23 '15 at 16:07