
I have a shell script as follows.

    #!/bin/bash

    myfunc() {
        # do something (call a REST service)
        sleep 300
        status=$(get status of the operation performed above)
        while [ "$status" != "succeeded" ]; do
            # do something (call a REST service)
            sleep 300
            status=$(get status of the operation performed above)
        done
    }

    a=0

    while [ $a -lt 1000 ]
    do
        echo "Starting myfunc with process : $a"
        myfunc
        echo "Finished myfunc with process : $a"
        a=$((a + 1))
    done

(Note the increment is placed after the second `echo`, so the Starting and Finished lines report the same process number.)

In the best case, the above script takes at least 300 s × 1000 = 300,000 seconds (over 83 hours) to finish running.

Is there a way to make the `myfunc` calls parallel, so that the while loop spawns multiple running instances of `myfunc`?

In addition, I want the following conditions to be satisfied.

  1. The script should wait until all instances of `myfunc` have completed.
  2. The output should look like the following (it should preserve the echo order).
    Starting myfunc with process : x
    Finished myfunc with process : x
    Starting myfunc with process : y
    Finished myfunc with process : y
    Starting myfunc with process : k
    Finished myfunc with process : k

It should not look like the below.

    Starting myfunc with process : x
    Starting myfunc with process : k
    Finished myfunc with process : k
    Starting myfunc with process : z
    Finished myfunc with process : x
    Finished myfunc with process : z
  • Does this answer your question? [Forking / Multi-Threaded Processes | Bash](https://stackoverflow.com/questions/1455695/forking-multi-threaded-processes-bash) – rsmeral Jun 10 '21 at 14:44
  • 1
    GNU Parallel does exactly this: Postpone the output until the job is done. Move the `echo` inside `myfunc`, `export -f myfunc`, run `parallel myfunc ::: {1..1000}` – Ole Tange Jun 13 '21 at 10:07
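If GNU Parallel is not available, both conditions can also be met in plain bash: run each instance in the background, redirect its output to a per-job file, `wait` for all jobs, then print the files in order. The sketch below is illustrative only: `myfunc` is stubbed to a short `sleep` in place of the REST call and 300-second polling, and `n=5` stands in for the real 1000.

```shell
#!/bin/bash
# Sketch only: parallelize myfunc while keeping Start/Finish lines paired.
# Assumptions: myfunc is stubbed with a short sleep; n would be 1000 and
# the sleep 300 s in the real script.

myfunc() {
    sleep 1   # stand-in for the REST call + status polling
}

n=5
tmpdir=$(mktemp -d)

for a in $(seq 0 $((n - 1))); do
    {
        echo "Starting myfunc with process : $a"
        myfunc
        echo "Finished myfunc with process : $a"
    } > "$tmpdir/$a.log" &   # each instance runs in the background
done

wait   # block until every background instance has finished (condition 1)

# Print the per-job logs in index order, so each Start/Finish pair stays
# together (condition 2) regardless of the order the jobs completed in.
result=$(for a in $(seq 0 $((n - 1))); do cat "$tmpdir/$a.log"; done)
printf '%s\n' "$result"
rm -rf "$tmpdir"
```

The key design point is that no job writes to the shared stdout directly; ordering is imposed only at the final, sequential `cat` step, after `wait` has guaranteed completion.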
