
I have a small workflow consisting of a single run.sh script that runs a few other scripts.

In run.sh I have a function that prints INFO and ERROR messages and saves them to a log.txt file, but the scripts themselves also generate output. I would like to save that output in another log file and still see it in the terminal when I run the pipeline.
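Roughly, the logging part of run.sh looks like this (a simplified sketch; the script names and variables are placeholders, not the real ones):

    #!/bin/bash
    # run.sh - simplified sketch of the pipeline

    log() {
        # write INFO/ERROR messages with a timestamp to log.txt
        echo "$(date '+%F %T') [$1] $2" >> log.txt
    }

    log INFO "starting pipeline"
    script1.sh "$FILE1"          # these scripts print their own output
    script2.sh "$FILE2" "$DIR"
    log INFO "pipeline finished"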

I run my pipeline with this command.

run.sh -f1 a file -f2 other file -d a/directory

I have seen that I can do this as explained in this link.

But as far as I know, this will not show the output in the terminal.

How can I get the output in the terminal and also save it to a file? I am using a cluster computer, and the output in the terminal is not saved if I lose the connection or log off from my PC.

1 Answer


Do you know about tee? Something like...

run.sh -f1 a file -f2 other file -d a/directory | tee output.txt

...would run your script and show the standard output, while at the same time storing it in output.txt.
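If you also want to capture the error messages from the scripts (not just their standard output), redirect stderr into stdout before the pipe:

    run.sh -f1 a file -f2 other file -d a/directory 2>&1 | tee output.txt

Add -a (... | tee -a output.txt) if you would rather append to the file than overwrite it on every run.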

Taqras
  • If the directory where I want to save the output.txt file is not created until the run starts, what can I do? – Manolo Dominguez Becerra Jan 24 '22 at 18:25
  • Save to a temporary file, or manually create the directory before the run. There is no way to create a directory simply by redirecting. – tripleee Jan 25 '22 at 06:48
  • I'd just store it somewhere else - it's a log of what happens, so make a folder for those logs and name the file with the date and time instead of output.txt (e.g. ... | tee $(date +"%Y-%m-%d_%H:%M:%S").txt). Maybe make a function that takes the two files and the directory as parameters and then do that command? – Taqras Jan 25 '22 at 07:15
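A rough sketch of the wrapper function suggested in the last comment could look like this (the function name and the logs directory are made up, adjust them to your setup):

    # run the pipeline, show its output, and save it to a time-stamped log
    run_logged() {
        local f1=$1 f2=$2 dir=$3
        local logdir=logs                       # hypothetical log directory
        mkdir -p "$logdir"                      # create it before redirecting
        run.sh -f1 "$f1" -f2 "$f2" -d "$dir" 2>&1 |
            tee "$logdir/$(date +"%Y-%m-%d_%H:%M:%S").txt"
    }

    # usage
    run_logged a_file other_file a/directory

Because tee writes to the file as the pipeline runs, the log survives even if the SSH session to the cluster is lost.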