
I am trying to execute curl 300 times at the same time using an array, but I do not know how to bring the contents of my file into the array. The code I wrote is below.

array=();
for i in {1..300}; do
  array+=( file.txt ) ; 
done; 
curl "${array[@]}";

The file.txt includes the following content:

--next 'https://d16.server.com/easy/api/OmsOrder' -H 'Connection: keep-alive' -H 'Pragma: no-cache' -H 'Cache-Control: no-cache' -H 'Accept: application/json, text/plain, */*' -H 'Sec-Fetch-Dest: empty' -H 'User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.132 Safari/537.36' -H 'Content-Type: application/json' -H 'Origin: https://d.server.com' -H 'Sec-Fetch-Site: same-site' -H 'Sec-Fetch-Mode: cors' -H 'Referer: https://d.server.com/' -H 'Accept-Language: en-US,en;q=0.9,fa;q=0.8' --data-binary '{"isin":"IRO3TPEZ0001","financeId":1,"quantity":50000,"price":5400}' --compressed
Andrea
  • What's in "file.txt"? URLs? Do you want to pass the first 300 lines in the file to curl and ignore the rest? – that other guy Mar 17 '20 at 19:13
  • @thatotherguy file.txt includes JSON stuff and it is too long. I would like to post it to the server 300 times. When I put the contents of the file on the command line I get the error "-bash: /usr/bin/curl: Argument list too long" – Andrea Mar 17 '20 at 19:19
  • There is no server in your example so this can't work. You should figure out a complete command for posting to a server once before trying to put anything in a loop or in an array. – that other guy Mar 17 '20 at 19:22
  • Your script does this: `curl file.txt file.txt file.txt ...`, with 300 times `file.txt`. – Benjamin W. Mar 17 '20 at 19:22
  • @thatotherguy I have put the stuff in file.txt. The file includes the server, headers, and data. – Andrea Mar 17 '20 at 19:31
  • You need a different approach depending on the exact format of the file. Can you please include an excerpt of it? – that other guy Mar 17 '20 at 19:34
  • @thatotherguy Please see the edited post; I have put the contents of the file in the post. – Andrea Mar 17 '20 at 19:41
  • Read up on the `mapfile` bash built-in command. – Shawn Mar 17 '20 at 19:41
  • http://mywiki.wooledge.org/BashFAQ/005#Loading_lines_from_a_file_or_stream – Shawn Mar 17 '20 at 19:44

3 Answers

array=()
for i in {1..300}; do
  array+=( $(head -n "$i" file.txt | tail -n 1) )
done
curl "${array[@]}"
M Imam Pratama
  • Could be improved a bit with sed -n "${i}p" Also, surely there's a way that isn't O(n^2)? Perhaps https://stackoverflow.com/questions/11393817/read-lines-from-a-file-into-a-bash-array ? – dstromberg Mar 17 '20 at 19:48

You have a file with shell-formatted words that you are trying to repeat over and over in a command.

Since the words are shell-formatted, you'll need to interpret them using e.g. eval:

contents=$(< file.txt)
eval "words=( $contents )"
arguments=()
for i in {1..300}
do
  arguments+=( "${words[@]}" )
done
curl "${arguments[@]}"

A more robust design would be to not use shell quoting and instead format one argument per line:

--next
https://d16.server.com/easy/api/OmsOrder
-H
Connection: keep-alive
-H
Pragma: no-cache

You can then use the above code and replace the eval line with:

mapfile -t words < file.txt
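
Putting it together, a sketch of the complete loop under that one-argument-per-line format:

mapfile -t words < file.txt        # one array element per line
arguments=()
for i in {1..300}
do
  arguments+=( "${words[@]}" )     # repeat the whole argument list 300 times
done
curl "${arguments[@]}"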
that other guy

The answer to this question should have been "put each request into a file, one option per line, and use -K/--config to include the file into the command line." That certainly should allow for 300 requests in a single curl command without exceeding the limit on the size of a shell command. (By "request" here, I mean "a URL with associated options". If you only want to use 300 URLs without modifying any other option, you can easily do that by just listing the URLs, on the command line if they aren't too long or otherwise in a file.)
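
To illustrate, the intended invocation would have been something along these lines (hypothetical file names; each file holds one request, one option per line, starting with --next):

curl -K request-1.config -K request-2.config -K request-3.config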

Unfortunately, it doesn't work. I believe that it is supposed to work, and the fact that it doesn't is a bug. If you specify multiple -K options and each of them refers to a file which includes one request and the --next option, then curl will execute only the first and last file. If you instead put the --next options on the command-line in between the -K options, all the request options will be merged, and in addition curl will complain about a missing URL.

However, you can use the -K option by concatenating all 300 requests and passing them through stdin, using -K - to read from stdin. To test that, I created the file containing a single request:

$ cat post-req
--next
-H "Connection: keep-alive"
-H "Pragma: no-cache" 
-H "Cache-Control: no-cache" 
-H "Accept: application/json, text/plain, */*" 
-H "Sec-Fetch-Dest: empty" 
-H "User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.132 Safari/537.36" 
-H "Content-Type: application/json" 
-H "Origin: https://d.server.com" 
-H "Sec-Fetch-Site: same-site" 
-H "Sec-Fetch-Mode: cors" 
-H "Referer: https://d.server.com/" 
-H "Accept-Language: en-US,en;q=0.9,fa;q=0.8" 
--data-binary "{\"isin\":\"IRO3TPEZ0001\",\"financeId\":1,\"quantity\":50000,\"price\":5400}" 
--compressed
--url "http://localhost/foo"

and then set up a little webserver that just returns the requested path, and invoked curl with:

for i in $(seq 300); do cat post-req; done | curl -K -

Indeed, all three hundred requests are passed through.

For what it's worth, I reported the bug as https://github.com/curl/curl/issues/5120, and many thanks to Daniel Stenberg for being incredibly responsive by committing a fix in less than two days. So probably the issue will be resolved in the next curl release.

rici
  • Thanks for the answer, but when I run `for i in $(seq 300); do cat post-req; done | curl -K -` in the terminal I get errors like `Warning: :1: warning: '$' is unknown Warning: :3: warning: ''https' is unknown`. I would be grateful if you could let me know why. And one more thing: how can I speed up these requests and make more of them? – Andrea Mar 19 '20 at 18:38
  • @andrea: the line `$ cat post-req` is me typing a bash command which shows you the file. You're not supposed to put it into the file. The error about https is because you didn't follow the model in my file; in a config file the `--url` option is obligatory. You can't leave it out like you can on the command line. See curl's help for more details. – rici Mar 19 '20 at 19:11
  • I have done it correctly and it works very well. Just wondering how I can speed this up and post 1000 in 1 second? Is there any way to do this? Indeed, my computer sends 300 requests in 9 seconds. – Andrea Mar 19 '20 at 19:18
  • For your other question, you could take a look at the `--parallel` option if your version of curl is recent enough. (You need 7.66.0, I believe.) Please try to avoid violating the terms of use of the webserver if it is not yours. – rici Mar 19 '20 at 19:19
  • Just tested the bash in the terminal, but it only sends 175 requests and gets their responses. It is supposed to post 300 requests and get 300 responses; what is going wrong? – Andrea Mar 20 '20 at 15:33
  • @andrea: perhaps the remote server is blocking you :-) Honestly, I have no idea what's going on as I cannot see over your shoulder, but it seems to me more likely to be a networking or server issue than a problem with curl. I suggest that you post a new question with enough detail that someone can answer it. – rici Mar 20 '20 at 16:00