I use this multi curl wrapper: https://github.com/php-curl-class/php-curl-class/
I'm looping through ~160 URLs and fetching XML data from them. As I understand it, the curl requests are executed in parallel. The strange thing is that if I set a small timeout (say, 10 seconds), more than half of the URLs can't be handled: the error callback fires with the message "Timeout was reached".
However, if I set the timeout to 100 seconds, almost all of the URLs are handled properly.
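Here is roughly what I'm doing (a simplified sketch; the actual URL list and XML processing are omitted, and the 10-second timeout is the value I was experimenting with):

```php
<?php
require __DIR__ . '/vendor/autoload.php';

use Curl\MultiCurl;

$urls = [/* ~160 feed URLs */];

$multi_curl = new MultiCurl();
// Per-request timeout in seconds; applied to every Curl instance in the queue
$multi_curl->setOpt(CURLOPT_TIMEOUT, 10);

$multi_curl->success(function ($instance) {
    // $instance->response contains the raw XML body
    echo 'OK: ' . $instance->url . "\n";
});

$multi_curl->error(function ($instance) {
    // With a 10s timeout, more than half of the requests end up here
    // with "Timeout was reached"
    echo 'Error: ' . $instance->url . ' - ' . $instance->errorMessage . "\n";
});

foreach ($urls as $url) {
    $multi_curl->addGet($url);
}

$multi_curl->start(); // blocks until all requests complete or time out
```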
But I cannot understand why this happens. If I use a single Curl instance and fetch data from any one of the URLs, I get a response pretty quickly; it certainly doesn't take 100 seconds to fetch data from a single URL.
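For comparison, this is how I test a single URL on its own (same 10-second timeout), and it comes back well within the limit:

```php
<?php
use Curl\Curl;

$curl = new Curl();
$curl->setOpt(CURLOPT_TIMEOUT, 10);
$curl->get($urls[0]); // any single URL from the same list

if ($curl->error) {
    echo 'Error: ' . $curl->errorMessage . "\n";
} else {
    // Responds quickly, nowhere near the timeout
    echo 'Fetched ' . strlen($curl->response) . " bytes\n";
}
```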
The whole point of multi curl is to run requests in parallel, and every request has its own timeout. So if the timeout is set to a small value (10, 20, or 30 seconds), why does it turn out not to be enough?
Later I'll have ~600 URLs, which would presumably mean increasing the timeout to 400-500 seconds, which seems weird. At that point I might as well create a single Curl instance and make the requests one by one with almost the same result.