
We can do this to save the output into a variable:

x=$(curl -s http://www.example.com)
y=$(curl -s http://www.example.com)

And we can do this to make HTTP requests concurrently:

curl -s http://www.example.com &
pid1=$!
curl -s http://www.example.com &
pid2=$!
wait $pid1 || echo failed
wait $pid2 || echo failed

But how do we combine the two?

My requirements are:

  1. Run the requests concurrently
  2. Save each response into its own variable without creating a temporary file
  3. Get the exit code of each request
  4. Use shell script only

Is this possible in a shell script? It would be much easier if I could use Python with a library like aiohttp... but no.

haudoing
  • I'd probably opt for a couple arrays to store **a)** responses and **b)** exit codes; I'd then put the `curl` call in a function which takes as arguments the URL and an instance id; the instance id would be used as the index into the 2 arrays; the main process would call the function (in this case) twice with each invocation being put in the background; then `wait` for both background processes to complete; then query the arrays for your responses and exit codes – markp-fuso Dec 16 '20 at 18:35
  • @markp-fuso would you mind writing up your solution? I'd like to know an elegant way to capture the output of multiple background processes into an array, keyed by PID – haudoing Dec 17 '20 at 04:45
  • apologies, I got myself mixed up with the 2 examples; kicking a process off in the background (obviously) spawns a new process with its own memory space, which in turn cannot be shared with the calling/parent process's memory space (ie, can't share arrays between the 2 separate processes); if you're not allowed to create an intermediate file to save the `curl` results you may want to see if [this Q&A](https://stackoverflow.com/q/20017805) could be used (this sounds like a homework question so not sure if you've covered `coproc` and/or `process substitution` at this point) – markp-fuso Dec 17 '20 at 15:37
  • I'm also assuming `without creating a temporary file` means you're not allowed to use named pipes – markp-fuso Dec 17 '20 at 16:45
  • I saw the Stack Overflow post you mentioned. The problem with `coproc` is that it doesn't support concurrency, and I'll actually need to handle more than 2 requests at a time. The most resource-efficient and maintainable way might be something like Python's asyncio library or goroutines, but I hope I don't have to install an extra library. – haudoing Dec 18 '20 at 10:04

1 Answer


The only issue you don't have a solution to is reading the output of background processes into variables. This can be achieved by using `exec`, ensuring that the output of the command is returned to the existing shell as opposed to a child shell:

y=$(exec curl -s http://www.example.com &)
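
If, as the comments note, you need more than two truly concurrent requests, process substitution (mentioned in the comment thread) is one option. The following is a minimal sketch, assuming `bash` (process substitution is not POSIX sh); the `fetch` helper is a hypothetical wrapper that appends each request's exit code as the final line of its own stream, so no temporary file is needed:

fetch() { curl -s "$1"; printf '\n%d' "$?"; }   # body, then exit code on its own line

exec 3< <(fetch http://www.example.com)   # request 1 starts immediately
exec 4< <(fetch http://www.example.com)   # request 2 runs concurrently with it

out1=$(cat <&3); exec 3<&-   # blocks for request 1; request 2 keeps running
out2=$(cat <&4); exec 4<&-

rc1=${out1##*$'\n'}; x=${out1%$'\n'*}   # split the exit code off the last line
rc2=${out2##*$'\n'}; y=${out2%$'\n'*}

echo "request 1 exited $rc1, request 2 exited $rc2"

Both subshells start running as soon as the `exec` lines execute; the reads merely collect the results in order. The pattern extends to more requests by allocating more file descriptors (bash 4.1+ can do this automatically with `exec {fd}< <(...)`).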
Raman Sailopal