
I'm trying to get the text output of a specified command, modify it somehow (e.g. add a prefix before the output) and write it to a file (.txt or .log):

LOG_FILE=...
LOG_ERROR_FILE=..
command_name >> ${LOG_FILE} 2>> ${LOG_ERROR_FILE}

I would like to do it in one line: modify what the command returns and write it to the files. The same goes for the error output and the regular output.

I'm a beginner with bash scripts, so please be understanding.

treekt

3 Answers


Create a function to execute commands and capture stderr and stdout into variables.

function execCommand(){
  local command="$@"
  # The merged stream contains the command's stderr first, then a NUL,
  # then its captured stdout, then another NUL; read both back in turn.
  {
    IFS=$'\n' read -r -d '' STDERR;   # everything up to the first NUL
    IFS=$'\n' read -r -d '' STDOUT;   # everything up to the next NUL
  } < <((printf '\0%s\0' "$($command)" 1>&2) 2>&1)
}

function testCommand(){
    grep foo bar
    echo "return code $?"
}

execCommand testCommand
echo err: $STDERR
echo out: $STDOUT

execCommand "touch /etc/foo"
echo err: $STDERR
echo out: $STDOUT

execCommand "date"
echo err: $STDERR
echo out: $STDOUT

Output:

err: grep: bar: No such file or directory
out: return code 2
err: touch: cannot touch '/etc/foo': Permission denied
out:
err:
out: Mon Jan 31 16:29:51 CET 2022

Now you can modify $STDERR & $STDOUT before writing them to files:

execCommand testCommand &&  { echo "$STDERR" > err.log; echo "$STDOUT" > out.log; }
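
For example, to add the prefixes the question asks about, you can filter the captured variables before writing them (a sketch; the prefixes and the file names out.log / err.log are only placeholders):

execCommand testCommand && {
  sed 's/^/OUT: /' <<< "$STDOUT" > out.log   # prefix every stdout line
  sed 's/^/ERR: /' <<< "$STDERR" > err.log   # prefix every stderr line
}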

Explanation: see the answer from madmurphy.

ufopilot

Pipes (|) and/or redirects (>) are the answer, it seems.

So, as a bogus example to show what I mean: to get all the interfaces that the command ip a spits out, you could pipe its output to the processing commands and redirect the result into a file.

ip a | awk -F': *' '/^[0-9]/ { print $2 }' > my_file.txt

If you wish to send each stream to separate processing, you could redirect into process substitutions:

$ command -V cd curl bogus > >(awk '{print $NF}' > stdout.txt) 2> >(sed 's/.*\s\(\w\+\):/\1/' > stderr.txt)
$ cat stdout.txt 
builtin
(/usr/bin/curl)
$ cat stderr.txt 
bogus not found

But it might be better for readability to process in a separate step:

$ command -V cd curl bogus >stdout.txt 2>stderr.txt
$ sed -i 's/.*\s//' stdout.txt
$ sed -i 's/.*\s\(\w\+\):/\1/' stderr.txt
$ cat stdout.txt 
builtin
(/usr/bin/curl)
$ cat stderr.txt 
bogus not found

There are a myriad of ways to do what you ask, and I guess the situation will have to decide what to use, but here's a start.
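
Applied to the question (add a prefix and write each stream to its own file in one line), the process-substitution variant could look something like this (a sketch; the prefixes are made up, and the log file paths are placeholders standing in for the question's variables):

LOG_FILE=out.txt            # placeholder path
LOG_ERROR_FILE=err.txt      # placeholder path
command -V cd curl bogus \
  > >(sed 's/^/INFO: /' > "$LOG_FILE") \
  2> >(sed 's/^/ERROR: /' > "$LOG_ERROR_FILE")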

Kaffe Myers

To modify the output and write it to a file, while modifying the error stream differently and writing it to a different file, you just need to manipulate the file descriptors appropriately, e.g.:

#!/bin/sh
# A command that writes trivial data to both stdout and stderr
cmd() {
        echo 'Hello stdout!'
        echo 'Hello stderr!' >&2
}


# Filter both streams and redirect to different files
{ cmd 2>&1 1>&3 | sed 's/stderr/cruel world/' > "$LOG_ERROR_FILE"; } 3>&1 |
     sed 's/stdout/world/' > "$LOG_FILE"

The technique is to redirect the error stream to stdout so it can flow into the pipe (2>&1), and to redirect the output stream to an ancillary file descriptor (1>&3), which is in turn redirected into a different pipe (3>&1).
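
Applied to the question, the same pattern can add a different prefix to each stream while writing to the two log files (a sketch; command_name, LOG_FILE and LOG_ERROR_FILE are the names from the question, and the prefixes are just assumptions):

{ command_name 2>&1 1>&3 | sed 's/^/ERROR: /' > "$LOG_ERROR_FILE"; } 3>&1 |
     sed 's/^/INFO: /' > "$LOG_FILE"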

You can clean it up a bit by moving the file redirections into an earlier exec call. eg:

#!/bin/sh
cmd() {
        echo 'Hello stdout!'
        echo 'Hello stderr!' >&2
}

exec > "$LOG_FILE"
exec 2> "$LOG_ERROR_FILE"

# Filter both streams and redirect to different files
{ cmd 2>&1 1>&3 | sed 's/stderr/cruel world/' >&2; } 3>&1 | sed 's/stdout/world/'
William Pursell