
I have a script that makes calls to an XML API on a remote server. Currently my script sends 10 requests serially to the remote server, fully processing each request before sending the next one. This is a huge bottleneck for my server at the moment since each API request can take up to a second. Since most of the time is spent waiting for the remote server to respond, I'm wondering if/how I can send the requests in parallel so that all 10 requests share one second of latency instead of incurring ten one-second latencies...

I thought about calling the script 10 times using a system command and running them in the background to effectively create 10 processes, but I'm not sure if that's the best way to do it. I figure this problem has probably been solved before.

user77413

4 Answers


Yes, you can use curl; see the manual.

You can also use non-blocking I/O.
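
Here is a minimal sketch of the curl approach using PHP's curl_multi functions; the URLs are placeholders for your ten API endpoints:

    <?php
    // Build one curl handle per request and register it with the multi handle.
    $urls = [
        'https://api.example.com/endpoint?id=1',
        'https://api.example.com/endpoint?id=2',
        // ... the rest of your ten request URLs
    ];

    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $i => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body instead of printing it
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }

    // Drive all transfers concurrently until every request has finished.
    $running = null;
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh); // wait for activity instead of busy-looping
    } while ($running > 0);

    // Collect the responses and clean up.
    $responses = [];
    foreach ($handles as $i => $ch) {
        $responses[$i] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

With this, the total wall-clock time is roughly that of the slowest request rather than the sum of all ten.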

Artefacto

If you can use system commands, then it is possible:

  1. Create PHP scripts that make the requests and write the response data to files (10 files in total)
  2. Write a PHP script that calls a system command to run those 10 scripts in the background, then waits until all of them are done (a sketch follows this list)
  3. Read the response data from the files
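
A minimal sketch of that layout, assuming a hypothetical worker script fetch_one.php that takes an index, performs one request, and writes the response to response_<n>.xml:

    <?php
    // Parent script: launch ten workers in the background, then collect their output.
    // fetch_one.php and the response_*.xml file names are illustrative placeholders.
    $count = 10;
    for ($i = 0; $i < $count; $i++) {
        // Redirecting output and appending "&" detaches the worker, so exec() returns immediately.
        exec(sprintf('php fetch_one.php %d > /dev/null 2>&1 &', $i));
    }

    // Wait for every worker to produce its file, then read the responses.
    $responses = [];
    for ($i = 0; $i < $count; $i++) {
        $file = "response_$i.xml";
        while (!file_exists($file)) {
            usleep(100000); // poll every 100 ms
        }
        $responses[$i] = file_get_contents($file);
    }

To avoid reading a half-written file, each worker can write to a temporary name and rename() it to its final name only once the request is complete.
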
Bang Dao

PHP doesn't support threads, but you can use curl_multi.

That will send your requests in parallel. This is a good solution because most of the time you are waiting on the network anyway.

If you design it carefully (use a queue of URLs and issue a processing callback as each one is done), you won't have to wait until the longest request has finished.
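
A rough sketch of that pattern, using curl_multi_info_read() to pick up each transfer as soon as it completes; process_response() is a hypothetical callback standing in for whatever you do with each XML payload:

    <?php
    $urls = [/* ... your ten API URLs ... */];

    $mh = curl_multi_init();
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
    }

    $running = null;
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);

        // Drain finished transfers and process each one immediately,
        // instead of waiting for the slowest request to complete.
        while ($info = curl_multi_info_read($mh)) {
            $ch = $info['handle'];
            process_response(curl_getinfo($ch, CURLINFO_EFFECTIVE_URL),
                             curl_multi_getcontent($ch));
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
    } while ($running > 0);

    curl_multi_close($mh);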

Byron Whitlock

What you're looking for is probably asynchronous calls. This can be done a couple of different ways in PHP depending on the version you're using. There is some great information on it in this question: Asynchronous PHP calls?

Justin Lucas