
Assume I'm a big Unix rookie:

  • I'm running a curl request through cron every 15 minutes.

  • curl is basically used to load a web page (PHP) that, given some arguments, acts as a script, like:

    curl http://example.com/?update_=1
    

What I would like to achieve is to run another "script" using this same curl technique,

  • every time the existing script is run
  • immediately before the existing script is run

I have read that curl accepts multiple URLs in one command, but I'm unsure if this would process the URLs sequentially or in "parallel".
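
For reference, the cron entry behind the 15-minute schedule looks something like this sketch (the URL is the example one above; -s keeps curl's progress output out of cron mail):

    */15 * * * * curl -s "http://example.com/?update_=1"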

Peter Mortensen
Riccardo

8 Answers


It would most likely process them sequentially (why not just test it?). But you can also do this:

  1. make a file called curlrequests.sh

  2. put the requests in it, like this:

    #!/bin/bash
    curl http://example.com/?update_=1
    curl http://example.com/?update_=3
    curl http://example.com/?update_=234
    curl http://example.com/?update_=65
    
  3. save the file and make it executable with chmod:

    chmod +x curlrequests.sh
    
  4. run your file:

    ./curlrequests.sh
    

or

   /path/to/file/curlrequests.sh

As a side note, you can chain requests with &&, like this:

   curl http://example.com/?update_=1 && curl http://example.com/?update_=2 && curl http://example.com/?update_=3

And execute them in parallel using &:

   curl http://example.com/?update_=1 & curl http://example.com/?update_=2 & curl http://example.com/?update_=3
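
If you start them in parallel like that, you may also want the shell to wait until all of them have finished before continuing; a minimal sketch:

    curl http://example.com/?update_=1 &
    curl http://example.com/?update_=2 &
    curl http://example.com/?update_=3 &
    wait   # block until all background curl processes have finished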
Yuri
Timothy Baldridge
  • Timothy, will chained requests like curl http://mysite.com/?update_=1 && curl http://mysite.com/?update_=2 &&...... execute sequentially? – Riccardo Jun 24 '10 at 16:28
  • Yes, in most cases they will, but if one of them ends with an error (a return value other than 0), the following ones will not be executed. – mbq Jun 24 '10 at 23:22
  • that's perfect for me. Thanks! – Riccardo Jun 25 '10 at 08:27
  • It doesn't work for me. In between two curl commands run sequentially, I get this message: `Connection #0 to host 127.0.0.1 left intact`. And then: `Closing connection #0` – ecbrodie Mar 10 '13 at 06:43
  • Don't forget to add quote when url contains `&` – Alcalyn Jun 29 '23 at 15:41

According to the curl man page:

You can specify any amount of URLs on the command line. They will be fetched in a sequential manner in the specified order.

So the simplest and most efficient approach is to put them all on a single invocation of curl; those to the same origin will be sent down a single TCP connection. For example:

curl http://example.com/?update_=1 http://example.com/?update_=2
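
You can confirm the connection reuse by adding -v; curl's verbose output notes that it is re-using the existing connection before the second request (the exact wording varies between curl versions):

    curl -v http://example.com/?update_=1 http://example.com/?update_=2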
Pierz

Another crucial method not mentioned here is reusing the same TCP connection for multiple HTTP requests, with exactly one curl command.

This saves network bandwidth and client and server resources, and avoids the need for multiple curl commands, since curl by default closes the connection when the command finishes.

Keeping the connection open and reusing it is very common for standard clients running a web app.

Starting with curl version 7.36.0, the --next (or -:) command-line option lets you chain multiple requests, and it is usable both on the command line and in scripts.

For example:

  • Sending multiple requests on the same TCP connection:

curl http://example.com/?update_=1 -: http://example.com/foo

  • Sending multiple different HTTP requests on the same connection:

curl http://example.com/?update_=1 -: -d "I am posting this string" http://example.com/?update_=2

  • Sending multiple HTTP requests with different curl options for each request:

curl -o 'my_output_file' http://example.com/?update_=1 -: -d "my_data" -s -m 10 http://example.com/foo -: -o /dev/null http://example.com/random

From the curl manpage:

-:, --next

Tells curl to use a separate operation for the following URL and associated options. This allows you to send several URL requests, each with their own specific options, for example, such as different user names or custom requests for each.

-:, --next will reset all local options and only global ones will have their values survive over to the operation following the -:, --next instruction. Global options include -v, --verbose, --trace, --trace-ascii and --fail-early.

For example, you can do both a GET and a POST in a single command line:

curl www1.example.com --next -d postthis www2.example.com

Added in 7.36.0.

Pizza
  • Is it possible to do this using a file with post data in it? Like I have a file with 100's of requests, 1 line per request, and I want to send all the requests to the same URL. – wryan Feb 17 '22 at 01:57
  • @wryan Absolutely! But you'll have to use some Linux CLI manipulation, for example `xargs`, or reading the file line by line until `EOF` is reached. Basically you turn whatever comes after each `-:` option into a variable that keeps feeding new parameters. – Pizza Feb 17 '22 at 21:34
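
A minimal sketch of that idea, assuming a hypothetical posts.txt with one POST body per line, all sent to the same made-up endpoint; it builds one curl command with -: between the requests:

    # hypothetical input: posts.txt, one POST body per line
    args=()
    first=1
    while IFS= read -r body; do
        if [ "${first}" -eq 1 ]; then first=0; else args+=(-:); fi
        args+=(-d "${body}" "http://example.com/endpoint")
    done < posts.txt
    # one curl invocation, one TCP connection, many POSTs
    curl "${args[@]}"

For files with hundreds of lines, keep an eye on the operating system's argument length limit.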

I'm 13 years late to the party, but I have something new to add compared to all other answers here.

I noticed you have a number at the end of the URL. I recently faced the same situation, where the number was a running index from 0 to 13. Here is how I solved it in one single line:

curl "http://example.com/?update_=[0-13]"

If, for example, you only want the even numbers, you can specify the step:

curl "http://example.com/?update_=[0-13:2]"

Here it is, verbatim, from the curl man page:

The URL syntax is protocol-dependent. You find a detailed description in RFC 3986.

You can specify multiple URLs or parts of URLs by writing part sets within braces and quoting the URL as in:

"http://site.{one,two,three}.com"

or you can get sequences of alphanumeric series by using [] as in:

"ftp://ftp.example.com/file[1-100].txt"

"ftp://ftp.example.com/file[001-100].txt" (with leading zeros)

"ftp://ftp.example.com/file[a-z].txt"

Nested sequences are not supported, but you can use several ones next to each other:

"http://example.com/archive[1996-1999]/vol[1-4]/part{a,b,c}.html"

Mehrad Mahmoudian

I think this approach uses more native capabilities:

# write the links to a file
$ echo "https://stackoverflow.com/questions/3110444/
https://stackoverflow.com/questions/8445445/
https://stackoverflow.com/questions/4875446/" > links_file.txt

# feed them all to a single curl invocation
$ xargs curl < links_file.txt
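
If you also want parallelism, xargs can fan the URLs out over several curl processes, one URL per invocation (the -P flag is available in GNU and BSD xargs):

    # run up to 4 curls at a time, one URL each
    xargs -n 1 -P 4 curl -s < links_file.txt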
Asclepius
user10089632

Write a script with the two curl requests in the desired order and run it from cron, like:

#!/bin/bash
curl http://mysite.com/?update_=1
curl http://mysite.com/?the_other_thing
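
And the matching crontab entry, assuming the script is saved as /path/to/update.sh, is executable, and keeps the 15-minute schedule from the question:

    */15 * * * * /path/to/update.sh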
mbq

This will do what you want: it uses an input file and runs the requests in the order they appear.

#!/bin/bash
file=/path/to/input.txt

# read one URL per line and request them in the order they appear
while IFS= read -r line; do
   curl "${line}"
done < "${file}"

Put one entry per line in your input file; the requests will follow the order of the file.

Save it as whatever.sh and make it executable.

MitchellK

You can also use braces {}.

Let's say you want to curl:

a. wildfly_datasources_jdbc_total

b. wildfly_datasources_jdbc_currently

curl http://127.0.0.1:9990/metrics/vendor/wildfly_datasources_jdbc_{total,currently}
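
Note that without quotes, many shells expand the braces themselves before curl ever sees them; that still sends both requests, but quoting hands the expansion to curl's own URL globbing:

    curl "http://127.0.0.1:9990/metrics/vendor/wildfly_datasources_jdbc_{total,currently}"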