
I use this multi curl wrapper: https://github.com/php-curl-class/php-curl-class/

I'm looping through ~160 URLs and fetching XML data from them. As I understand it, the curl requests are done in parallel. The strange thing is that if I set a small timeout (say, 10 seconds), more than half of the URLs can't be handled: I receive the error callback with the message "Timeout was reached".

However, if I set the timeout to 100 seconds, almost all URLs are handled properly.

But I can't understand why this happens. If I use a single Curl instance and fetch data from any of the URLs, I get a response pretty quickly; it doesn't take 100 seconds to fetch data from a single URL.

So the purpose of multi curl is to do requests in parallel, and every request has its own timeout. Then, if the timeout is set to a small value (10, 20, 30 seconds), why does it turn out not to be enough?

Later I'll have ~600 URLs, which would mean the timeout should probably be increased to 400-500 seconds, which is weird. I might as well create a single Curl instance and do the requests one by one with almost the same result.
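A plausible explanation (an assumption, not confirmed by the question): the multi handle caps how many transfers run at once, so with ~160 URLs most requests sit in a queue, and a short per-request timeout can expire before a queued request ever starts transferring. With PHP's built-in curl_multi API, such a cap would look like this (the limit of 10 and the URLs are illustrative placeholders):

```php
<?php
// Illustrative sketch: cap the number of simultaneous transfers on a
// curl multi handle. With ~160 easy handles and a small cap, most
// handles wait in a queue, and a short per-transfer CURLOPT_TIMEOUT
// can expire before a queued handle ever gets its turn.
$urls = [
    'https://example.com/feed1.xml', // placeholder endpoints
    'https://example.com/feed2.xml',
];

$mh = curl_multi_init();
// Limit concurrent transfers; 10 is an arbitrary value for illustration.
curl_multi_setopt($mh, CURLMOPT_MAX_TOTAL_CONNECTIONS, 10);

$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10); // per-transfer timeout in seconds
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}
// (The curl_multi_exec loop is omitted; this only shows where the cap is set.)
```

If this is what is happening, raising the timeout "works" only because it also covers the time each request spends waiting in the queue.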

Victor
  • Possible duplicate of [What is the maximum number of cURL connections set by?](https://stackoverflow.com/questions/13850951/what-is-the-maximum-number-of-curl-connections-set-by) – Patrick Q Oct 24 '17 at 13:16
  • https://curl.haxx.se/libcurl/c/CURLMOPT_MAX_TOTAL_CONNECTIONS.html – Alex Blex Oct 24 '17 at 13:21
  • @JoseManuelVillasanteArmas Are you open to using a different PHP library? It uses curl in the background, but has a much cleaner interface – Tarun Lalwani Nov 01 '17 at 05:23

1 Answer


Curl in PHP can't make a truly asynchronous request.

Unlike JavaScript's fetch, where promises let you fire a request and handle the response later, PHP blocks on curl. The closest native option is the curl_multi API, which runs several transfers in parallel inside one blocking loop. Ways to emulate asynchronous requests in PHP are discussed in Async curl request in PHP and PHP Curl async response.
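To make the curl_multi approach concrete, here is a minimal sketch that fetches URLs in parallel batches. It uses only PHP's built-in curl extension (not the php-curl-class wrapper from the question); the batch size caps concurrency so every handle starts transferring immediately, and CURLOPT_TIMEOUT then covers only the active transfer rather than time spent queued:

```php
<?php
// Fetch URLs in parallel batches using PHP's built-in curl_multi API.
function fetchInBatches(array $urls, int $batchSize = 20, int $timeout = 10): array
{
    $results = [];
    foreach (array_chunk($urls, $batchSize) as $batch) {
        $mh = curl_multi_init();
        $handles = [];
        foreach ($batch as $url) {
            $ch = curl_init($url);
            curl_setopt_array($ch, [
                CURLOPT_RETURNTRANSFER => true,
                CURLOPT_TIMEOUT        => $timeout, // per-transfer timeout
                CURLOPT_FOLLOWLOCATION => true,
            ]);
            curl_multi_add_handle($mh, $ch);
            $handles[$url] = $ch;
        }
        // Drive all transfers in this batch to completion.
        do {
            $status = curl_multi_exec($mh, $active);
            if ($active) {
                curl_multi_select($mh); // wait for activity on any handle
            }
        } while ($active && $status === CURLM_OK);
        // Collect bodies (or error messages) and clean up.
        foreach ($handles as $url => $ch) {
            $results[$url] = curl_errno($ch) === 0
                ? curl_multi_getcontent($ch)
                : curl_error($ch);
            curl_multi_remove_handle($mh, $ch);
            curl_close($ch);
        }
        curl_multi_close($mh);
    }
    return $results;
}
```

With 600 URLs and a batch size of 20, a 10-second timeout applies per batch, so the whole run is bounded by roughly 30 batches rather than one 400-500 second timeout.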