
I have a C shell script that does something like this:

#!/bin/csh
gcc example.c -o ex
gcc combine.c -o combine
ex file1 r1     <-- 1
ex file2 r2     <-- 2
ex file3 r3     <-- 3
#... many more like the above
combine r1 r2 r3 final
\rm r1 r2 r3

Is there some way I can make lines 1, 2 and 3 run in parallel instead of one after another?

Lazer

7 Answers


Convert this into a Makefile with proper dependencies. Then you can use make -j to have Make run everything possible in parallel.

Note that all the recipe indents in a Makefile must be TABs; the TAB is how Make recognizes the command lines of a rule.

Also note that this Makefile uses GNU Make extensions (the wildcard and subst functions).

It might look like this:

export PATH := .:${PATH}

FILES=$(wildcard file*)
RFILES=$(subst file,r,${FILES})

final: combine ${RFILES}
	combine ${RFILES} final
	rm ${RFILES}

# Explicit recipes: Make's built-in %: %.c rule will not build 'ex'
# from example.c (the names differ), so spell the compile steps out.
ex: example.c
	gcc example.c -o ex

combine: combine.c
	gcc combine.c -o combine

r%: file% ex
	ex $< $@
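
With that in place, a parallel build is just the following (the 4 in -j4 is illustrative; see the comment below on why a bare make -j is risky):

make -j4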
Zan Lynx
  • Your `-l combine` should be `-o combine` – SiegeX May 07 '10 at 19:39
  • Would you put the `final` rule first to make it the default? – glenn jackman May 07 '10 at 19:41
  • I like your answer but I just realized his question states more than just 3 possible files and this doesn't seem to scale all that well. – SiegeX May 07 '10 at 23:43
  • Beware the habit of running "make -j" without the integer argument. It will keep spawning as fast as it can. This can cripple a machine during a build with a lot of source files. A better habit is something like "make -j8" – Mark Borgerding May 10 '10 at 19:16

In bash I would do:

ex file1 r1  &
ex file2 r2  &
ex file3 r3  &
wait
... continue with script...

The trailing & sends each command to the background so they run in parallel, and wait blocks until all of them have finished. You can check out this SO thread for another example.
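
If you also need to know whether any of the background jobs failed, one option is to collect the PIDs and wait on each one individually; a minimal bash sketch (the pids array name is just illustrative):

pids=()
ex file1 r1 & pids+=($!)
ex file2 r2 & pids+=($!)
ex file3 r3 & pids+=($!)

# Waiting on each PID individually preserves its exit status
for pid in "${pids[@]}"; do
  wait "$pid" || echo "job $pid failed" >&2
done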

AlG
#!/bin/bash

gcc example.c -o ex
gcc combine.c -o combine

# Call 'ex' 3 times in "parallel"
for i in {1..3}; do
  ex file${i} r${i} &
done

# Wait for all background processes to finish
wait

# Combine & remove
combine r1 r2 r3 final
rm r1 r2 r3

I slightly altered the code to use brace expansion {1..3} rather than hard-code the numbers, since you said there are many more files than just 3. Brace expansion makes scaling trivial: replace the '3' inside the braces with whatever number you need (or see the seq sketch below for a variable count).
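
One caveat: brace expansion happens before variable expansion, so {1..$N} will not work when the count lives in a variable. A sketch using seq instead (the name N here is illustrative):

# N is however many input files you have
N=3
for i in $(seq 1 "$N"); do
  ex "file${i}" "r${i}" &
done
wait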

SiegeX

You can append & to each command and then wait:

#!/bin/csh
echo start
sleep 1 &
sleep 1 &
sleep 1 &
wait
echo ok

test:

$ time ./csh.sh 
start
[1] 11535
[2] 11536
[3] 11537
[3]    Done                   sleep 1
[2]  - Done                   sleep 1
[1]  + Done                   sleep 1
ok

real    0m1.008s
user    0m0.004s
sys 0m0.008s
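
Applied to the script from the question, the same idea would look like this (a sketch, keeping the original csh):

#!/bin/csh
gcc example.c -o ex
gcc combine.c -o combine
ex file1 r1 &
ex file2 r2 &
ex file3 r3 &
wait
combine r1 r2 r3 final
\rm r1 r2 r3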
Oleg Razgulyaev

GNU Parallel would make it pretty:

seq 1 3 | parallel ex file{} r{}

Depending on how 'ex' and 'combine' work you can even do:

seq 1 3 | parallel ex file{} | combine
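
By default GNU Parallel runs one job per CPU core; the -j flag overrides that limit if you need to (the 8 here is illustrative):

seq 1 3 | parallel -j8 ex file{} r{}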

Learn more about GNU Parallel by watching http://www.youtube.com/watch?v=LlXDtd_pRaY

Ole Tange

xargs can do it:

seq 1 3 | xargs -P 0 -I % ex file% r%

-I % substitutes each input line for % in the command, one invocation per line (so -n 1 is redundant and omitted); -P 0 runs as many of those invocations in parallel as possible (pass a number such as -P 4 to cap it).

avim

You could use nohup ex:

nohup ex file1 r1 &    
nohup ex file2 r2 &
nohup ex file3 r3 &
ant
  • `nohup` is massive overkill here. Its goal is to let a process continue running even if the terminal the shell is running in is closed, which isn't what the OP is asking for here. Moreover, you don't need it even for that -- the shell's builtin `disown` command can do the parts of that that can't be done just by redirecting stdin/stdout/stderr away from the TTY. And further, `nohup` prevents the shell from finding out if any of the `ex` commands failed after-the-fact, which the OP here presumably *does* want. – Charles Duffy Apr 13 '18 at 16:21