
I have a list of (bash) commands I want to run:

<Command 1>
<Command 2>
...
<Command n>

Each command takes a long time to run, and sometimes after seeing the output of (e.g.) <Command 1>, I'd like to update a parameter of <Command 5>, or add a new <Command k> at an arbitrary position in the list. But I want to be able to walk away from my machine at any time, and have it keep working through my last update to the list.

This is similar to the question here: Edit shell script while it's running. Some of those answers could be made to serve, but that question had the additional constraint of wanting to edit the script file itself, and I suspect there is a simpler answer because I don't have that exact constraint.

My current solution is to end my script with a call to a second script. Since I can edit the second file while the first one runs, this lets me append new commands to the end of my list, but I can't change the commands already in the first file, and once execution moves into the second file I can't make further changes there either. I often stop my script to insert updates, which sometimes means killing a long command that is almost complete just so I can update later items before leaving my machine. I could of course chain together many files this way, but that seems a mess for what (hopefully) has a simple solution.
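A minimal runnable sketch of this chaining pattern, with hypothetical file names in a scratch directory: `first.sh` runs its fixed commands and then hands off to `next.sh`, which stays editable right up until that hand-off.

```shell
#!/bin/bash
# Sketch of the two-file chain described above (all names hypothetical)
workdir=$(mktemp -d)

# next.sh can still be edited while first.sh works through its commands
cat > "$workdir/next.sh" <<'EOF'
#!/bin/bash
echo "appended later"
EOF

# first.sh runs the fixed part of the list, then hands control to next.sh
cat > "$workdir/first.sh" <<'EOF'
#!/bin/bash
echo "command 1"
echo "command 2"
exec "$(dirname "$0")/next.sh"
EOF
chmod +x "$workdir"/*.sh

"$workdir/first.sh"
```

Using `exec` for the hand-off means `first.sh` is replaced by `next.sh` rather than waiting on it, which is why nothing in `first.sh` can run afterwards.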

robm
  • So you want to use branches without actually using branches? I suggest you just add more logic to your script so it can handle all desired situations. It looks like you're having an XY problem. – fanton Aug 22 '16 at 07:31

2 Answers


This is more of a conceptual answer than one where I provide the full code. My idea would be to run Redis (Redis description here) - it is pretty simple to install - and use it as a data-structure server. In your case, the data structure would be a list of jobs.

So, you basically add each job to a Redis list, which you can do using LPUSH at the command line:

echo "lpush jobs job1" | redis-cli

You can then start one or more workers, in parallel if you wish; each sits in a loop doing a blocking pop (BRPOP, which waits until a job is available) off the list and running whatever it gets:

#!/bin/bash
while :; do
    # BRPOP blocks until a job arrives; redis-cli prints the key name
    # first, so keep only the last line (the job itself)
    job=$(redis-cli brpop jobs 0 | tail -1)
    eval "$job"
done

And then you are at liberty to modify the list, with deletions and insertions, while the worker(s) is/are running.
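For instance (assuming a running Redis server, and hypothetical job names), the pending queue can be edited from the command line like this; note that with LPUSH feeding one end and BRPOP draining the other, LINSERT's BEFORE/AFTER positions are relative to the list head, which is the last-to-run end:

```shell
redis-cli rpush jobs "urgent-job"            # jump the queue: runs next
redis-cli linsert jobs AFTER "job5" "job5b"  # insert next to an existing entry
redis-cli lrem jobs 1 "job3"                 # delete a job that hasn't started
redis-cli lrange jobs 0 -1                   # inspect the current queue
```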

Example here.

Mark Setchell
  • I like this idea. It could be adapted to any queue system, including a simple homegrown one which keeps the queue in a set of numbered files. – tripleee Aug 22 '16 at 09:37

I would say to put each command that you want to run in its own file, and list all of the command files in the main file.

ex: main.sh

#!/bin/bash

# Absolute path of the directory holding your command scripts
scriptPath="/home/script"

# Names of your scripts
scriptCommand1="command_1.sh"
scriptCommand2="command_2.sh"
...
scriptCommandN="command_N.sh"

# Execute them in order
"$scriptPath/$scriptCommand1"
"$scriptPath/$scriptCommand2"
...
"$scriptPath/$scriptCommandN"

I suppose that while one is running you can modify the others, since they are external files.
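A runnable sketch of that claim, with hypothetical file names in a scratch directory: the main sequence is started in the background, `command_2.sh` is rewritten while `command_1.sh` is still sleeping, and the updated version is what ends up running, because each command file is only read when the main script reaches it.

```shell
#!/bin/bash
# Demo of editing a later command file while an earlier one runs
# (all names hypothetical; everything lives in a scratch directory)
scriptPath=$(mktemp -d)

cat > "$scriptPath/command_1.sh" <<'EOF'
#!/bin/bash
sleep 1
echo "first"
EOF

cat > "$scriptPath/command_2.sh" <<'EOF'
#!/bin/bash
echo "second (original)"
EOF
chmod +x "$scriptPath"/command_*.sh

# The "main file": run the command scripts in order, in the background
( "$scriptPath/command_1.sh"; "$scriptPath/command_2.sh" ) > "$scriptPath/out.txt" &

# While command_1.sh is still sleeping, rewrite command_2.sh
cat > "$scriptPath/command_2.sh" <<'EOF'
#!/bin/bash
echo "second (updated)"
EOF

wait
cat "$scriptPath/out.txt"
```

Note this only lets you change the *contents* of commands that haven't run yet; the list of files itself is fixed once the main script starts.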

Tarek