
I have a list of scripts, each doing its own thing (they are actually R scripts that read, modify and write files), like this:

## Script 1
echo "1" > file1.out
## Script 2
echo "2" > file2.out
## Script 3
echo "3" > file3.out

These are saved in different scripts as follows:

## Writing script 1
echo "echo \"1\" > file1.out" > script1.task
## Writing script 2
echo "echo \"2\" > file2.out" > script2.task
## Writing script 3
echo "echo \"3\" > file3.out" > script3.task

Is there a way to run all these scripts in parallel using the file names? In a loop it'd look like this:

for task_file in *.task
do
  sh "${task_file}"
done
Thomas Guillerme
  • Ampersand is your friend. See https://stackoverflow.com/questions/3004811/how-do-you-run-multiple-programs-in-parallel-from-a-bash-script?rq=1 – Andre Wildberg Nov 29 '22 at 12:40
  • You can run the scripts in the background, and after the loop use `wait` to wait until all are finished. However, how many processes are you planning to run in parallel? Are you sure that your machine can handle them? – user1934428 Nov 29 '22 at 13:00
  • Thanks both, I will use the `&` and `wait` design. The number of tasks generated is a function of the number of cores available, so the pipeline is designed to be portable to any machine (a one-core machine will just end up running a regular for loop). – Thomas Guillerme Nov 29 '22 at 13:46

2 Answers


If you only have 3, the answer is to use `&`.

But if you have 1000s:

seq 10000 | parallel 'echo {} > file{}.out'
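Applied to the question's `.task` files instead of generated commands, a minimal sketch (the setup lines and file names are illustrative; it assumes GNU parallel is installed, hence the guard):

```shell
# Demo setup (hypothetical file names): two throwaway task scripts in a temp dir.
cd "$(mktemp -d)"
echo 'echo 1 > file1.out' > script1.task
echo 'echo 2 > file2.out' > script2.task

# GNU parallel runs one job per CPU core by default;
# ::: feeds each matching file name to sh as an argument.
if command -v parallel >/dev/null 2>&1; then
    parallel sh ::: *.task
fi
```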
Ole Tange

Following the advice from Andre Wildberg and user1934428, here's a solution using `&`, just for the record:

for task_file in *.task
do
  sh "${task_file}" &
done
wait
Thomas Guillerme
  • Put the `wait` outside the loop, otherwise it just does the same as without the `&`. E.g. `for i in {1..5};do sleep 2 & echo "OK"; wait; done` vs `for i in {1..5};do sleep 2 & echo "OK"; done; wait` – Andre Wildberg Nov 29 '22 at 15:26