
I have several bunches of Linux commands contained in a text file, separated by \n\n, and I would like to automatically paste each bunch into a given screen process. For the sake of clarity, let's say my command.txt simply contains:

#first bunch of commands:    
executable_script1.sh
mv executable_script1 directory1


#second bunch of commands:    
executable_script2.sh
mv executable_script2 directory2

So the first bunch of commands would run executable_script1.sh, after which it would move executable_script1 to directory1. In this example, my screen contains 3 processes:

0$ htop
1$ bash
2$ bash

The names of the processes are irrelevant; the only important information is that I would like bunch N to go to screen process N$, as 0$ is always htop.

Until now, I have been copying/pasting each bunch of commands manually into the corresponding screen process, which obviously worked, but now I will be dealing with more than 40 bunches of commands and as many screen processes. So, how could I paste bunch N into the N$ screen terminal automatically? I think a bash/shell script could do the trick, but I am not fluent enough with it. I currently use a python2 script to generate my command.txt file, so note that I could create one txt file per bunch of commands pretty easily if needed.
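For reference, since the bunches are separated by blank lines, generating one file per bunch can be sketched with awk's paragraph mode; the helper name and the `bunchN.txt` output names below are just illustrations, not anything required by screen:

```shell
# Hypothetical helper: split a file of blank-line-separated bunches into
# bunch1.txt, bunch2.txt, ... using awk's paragraph mode (RS=''), which
# treats one or more consecutive blank lines as a single separator.
split_bunches() {
    awk -v RS='' '{ print > ("bunch" NR ".txt") }' "$1"
}
# Example: split_bunches command.txt
```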

Could you help me with this? Please feel free to ask for any missing information.

PS: I also asked this question on Unix Stackexchange, but this forum seems far less populated... If we find an answer here, I will invite the answerer to paste it under my Unix Stackexchange question as well, as this could help others!

jeannej
  • It sounds like you really need a job scheduler tool. I think you'd better spend your time looking for ways to move away from screen, as that is only giving you a way to run a command in parallel with others (yes, and display the output to a screen). But why not just save that output to a log file, which gives you many more options for monitoring and reporting on what your "system" is doing. Just IMHO. Good luck! – shellter Aug 30 '19 at 23:18
  • thank you @shellter for your insight! The thing is that I want to be able to manage (launch, check and stop) my scripts from home as well. I have been using screen to this end for a year or so and it fits this need perfectly. I am running calculations, so the log file would be written only at the end, I think. Or did you have anything else in mind? – jeannej Sep 01 '19 at 19:12
  • 1
    Yes, I have used `screen` as a base for letting processes run and then checking in on them from home. But I can't imagine trying to do that with 40 projects. For you, the next step is to put things into a crontab file. A crontab entry like `59 11 1-5 * * { /path/to/script1 -f cfgFile -s[ample args...] /path/to/input/data/script1Input/* > /path/to/reports/script1/report.txt 2> /path/to/logs/script1.errslog.$(/bin/date +\%Y\%m\%d.\%H\%M) ;}` would allow you to control your input, output and error messages. You can always login and look in your `report` or `errslog` file to see what is happening. – shellter Sep 01 '19 at 19:55
  • 1
    You can use the `less` utility for that. Read about all of its options. You can keep track of your crontab file using an sccs system (`git`?) and import and export the file to your `sccs` easily. `crontab -e` lets you "edit/create" your crontab per user. `crontab -l > /path/to/sccs/crontab.ver1.1` lets you save the file so you can edit it there separately, or just check it in. – shellter Sep 01 '19 at 19:58
  • Depending on your $$ resources, you might want to skip this step and learn how to use a real job scheduler. `autosys` is one such product. They are usually expensive. There must be an opensource scheduler, but I can't offer any names or opinions. You can do incredibly complex control of processes with these tools. If you are on a zero budget and just want to get going, then there are many Q/A here about `crontab` entries, so read up on them. And check local documentation: `info crontab` or `man crontab` will show you what options are available on your system. – shellter Sep 01 '19 at 20:04
  • 1
    To manage/stop a script, I would just rely on `kill -15 $jobPID`. You can designate a special file to hold the PID value for a running job. But you'll need to understand how many levels of processes are created for your jobs. Look at the output of `ptree` to see what I mean. You can't just blindly take the first $PID value you find to do a clean stop. OK, that's it for now. Realize that this sort of general conversation about tools is really off-topic here as a S.O. q. You might get some traction on https://chat.stackoverflow.com/rooms/98569/bin-bash ... – shellter Sep 01 '19 at 20:09
  • but answering your specific needs is really a full time job, ... yours! ;-) So good luck. and post focused Qs including sample inputs (small!), required output, your current code/output/errorMsgs and then you'll be on-topic and get a lot of help. – shellter Sep 01 '19 at 20:10
  • thank you for taking the time to explain your point further, @shellter. I am willing to use self-made scripts only, so I will dig a bit deeper into `crontab`. Thanks again for the hint, and I will post more specific questions about the topic if needed! – jeannej Sep 02 '19 at 01:30
  • 1
    See https://stackoverflow.com/tags/cron/info for a pretty comprehensive write up on crontab. You can always post a new Q if you get stuck. Be sure to include the smallest sample of code (crontab entry) that causes the problem. Best to include your OS information, `uname -srv` output will do. Redirect your cron entry output to a log file, i.e. `01 01 01 01 * VAR=true /path/to/myScript args > /tmp/myScript.log 2>&1` (VAR=true is just to illustrate you can set an envVar for myScript by pre-pending it to the command). Good luck. – shellter Sep 05 '19 at 15:44

1 Answer


I finally found my answer, thanks to this post! Sometimes it just takes some other keywords to find a solution, so I will answer this question in case some other fellows end up here.

In a nutshell

Automatically paste commands into screen with the bash command:

screen -x screen_name -p 1 -X stuff 'executable_script1.sh\n'

where -p 1 refers to the 1$ screen process. Note that the \n at the end of the command is necessary; it plays the role of pressing Enter after pasting a command line.

Detailed steps

1) Create the screen session you want to work in (here named 'screen_name'):

screen -S screen_name

with enough processes for all the commands (in my example, 0$ htop plus 2 processes: 1$ and 2$). Note that you can edit .screenrc in your home directory so that screen sessions start with a given number of processes by default. For this example, my .screenrc contains:

screen -t htop
screen -t 
screen -t 

2) Create bash files for every bunch of commands, to be executed by the different screen processes.

Here I have 2 files, screen1 containing:

#!/bin/bash

screen -x screen_name -p 1 -X stuff 'executable_script1.sh\n'
screen -x screen_name -p 1 -X stuff 'mv executable_script1 directory1\n'

and screen2 containing:

#!/bin/bash

screen -x screen_name -p 2 -X stuff 'executable_script2.sh\n'
screen -x screen_name -p 2 -X stuff 'mv executable_script2 directory2\n'

3) Paste all your commands at once in a terminal, with:

bash /path_to_screen1/screen1 & bash /path_to_screen2/screen2 &

You can close this terminal immediately, even if you have long-running calculations, as all it does is paste the commands into screen. Manually open your screen session to verify that these lines are being executed.

Needless to say, if you have a great number of commands to pass to many screen processes, you can create the bash files and paste the commands (steps 2 and 3) via a script (with python for instance). Also, executable_script1.sh can contain python calls if needed, with python python_script.py, as in a normal terminal.
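Steps 2 and 3 can also be collapsed into one bash function that reads command.txt directly, splits it on blank lines, and stuffs bunch N into window N. This is only a sketch: the session name 'screen_name' is taken from the example above, and the SCREEN override (handy for a dry run without a live session) is my own addition, not part of screen itself.

```shell
#!/bin/bash
# Set SCREEN=echo before calling stuff_bunches to preview the generated
# screen commands without a running session.
SCREEN=${SCREEN:-screen}

stuff_bunches() {
    local file=$1 n=0 in_bunch=0 line
    while IFS= read -r line || [ -n "$line" ]; do
        if [ -z "$line" ]; then
            in_bunch=0            # a blank line ends the current bunch
            continue
        fi
        if [ "$in_bunch" -eq 0 ]; then
            n=$((n + 1))          # new bunch: advance to the next window
            in_bunch=1
        fi
        # send the line plus Enter to window N (window 0 runs htop)
        $SCREEN -x screen_name -p "$n" -X stuff "$line"$'\n'
    done < "$file"
}

# Example: stuff_bunches command.txt
```

A dry run with `SCREEN=echo stuff_bunches command.txt` prints each screen invocation instead of executing it, which is a cheap way to check the window numbering before touching a real session.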

Hope this will help others!

jeannej