
I am currently using curl to download JSON files. The problem I am facing is how long the task takes: my database has over 1,000 URLs that curl has to load and use the result of. How can I run these requests concurrently, with a maximum of 10 requests at the same time?

Question: How can I make 10 curl requests run at the same time?

$query = "SELECT url FROM sources WHERE cached = 0";
$result = $mysqli->query($query);

foreach ($result as $row) {
    $url = $row['url']; // each row is an associative array, not a plain string
    // Make curl request...
}
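One way to sketch this is with PHP's `curl_multi` API: prime a pool with 10 handles, then add a new one each time a transfer finishes, so at most 10 are in flight. The helper name `fetchConcurrently` and the 30-second timeout are illustrative choices, not anything from the question:

```php
<?php
// Sketch: fetch a list of URLs with at most $maxConcurrent transfers
// in flight at once, using curl_multi. Returns url => response body.

function fetchConcurrently(array $urls, int $maxConcurrent = 10): array
{
    $multi   = curl_multi_init();
    $results = [];
    $queue   = $urls;

    $addHandle = function (string $url) use ($multi) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        curl_multi_add_handle($multi, $ch);
    };

    // Prime the pool with the first $maxConcurrent URLs.
    for ($i = 0; $i < $maxConcurrent && $queue; $i++) {
        $addHandle(array_shift($queue));
    }

    do {
        curl_multi_exec($multi, $running);
        if (curl_multi_select($multi) === -1) {
            usleep(1000); // avoid a tight loop if select fails
        }

        // Collect finished transfers and refill the pool from the queue.
        while ($info = curl_multi_info_read($multi)) {
            $ch  = $info['handle'];
            $url = curl_getinfo($ch, CURLINFO_EFFECTIVE_URL);
            $results[$url] = curl_multi_getcontent($ch);
            curl_multi_remove_handle($multi, $ch);
            curl_close($ch);

            if ($queue) {
                $addHandle(array_shift($queue));
            }
        }
    } while ($running || $queue);

    curl_multi_close($multi);
    return $results;
}
```

The key point is refilling the pool inside the `curl_multi_info_read` loop rather than waiting for all 10 to finish before starting the next batch, so the 10 slots stay busy for the whole run.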
  • Why don't you try to use **file_get_contents($url)** instead of curl? – Vixed Feb 14 '16 at 15:14
  • What benefit would that give me? Curl is faster, and file_get_contents also makes me run into the problem of only loading one at a time... –  Feb 14 '16 at 16:06
  • 1
    Look at this question http://stackoverflow.com/questions/9308779/php-parallel-curl-requests, this article http://www.codediesel.com/php/parallel-curl-execution/ and this article http://phplens.com/phpeverywhere/?q=node/view/254 – quasoft Feb 14 '16 at 16:09
  • I will take a look, thanks –  Feb 14 '16 at 16:10
  • can I please ask you what you do with the result? Saving files, placing in db, or just print? – Vixed Feb 14 '16 at 16:10
  • I load, for example, 2000 nicknames from a database; for every nickname a curl request is made to a specific URL to get a JSON file with information such as scores, which are saved into the database with UPDATE. –  Feb 14 '16 at 20:30
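The last comment describes the per-response step: decode each JSON body and write the score back with an UPDATE. A minimal sketch, assuming the responses have been collected into a `$results` array of `url => body`, and assuming a table `players` with columns `nickname` and `score` and a JSON payload containing those two fields (all hypothetical names):

```php
<?php
// Sketch: decode each fetched JSON body and persist the score with a
// prepared UPDATE. Credentials, table, column, and JSON field names
// are placeholders, not from the question.

$mysqli = new mysqli('localhost', 'user', 'pass', 'db');
$stmt = $mysqli->prepare('UPDATE players SET score = ? WHERE nickname = ?');

foreach ($results as $url => $body) {
    $data = json_decode($body, true);
    if (!is_array($data) || !isset($data['score'], $data['nickname'])) {
        continue; // skip invalid or incomplete JSON responses
    }
    $stmt->bind_param('is', $data['score'], $data['nickname']);
    $stmt->execute();
}

$stmt->close();
```

Preparing the statement once outside the loop avoids re-parsing the SQL for each of the 2000 updates, and the bound parameters keep the nicknames safely escaped.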

0 Answers