
What is the best way to fetch JSON from more than 100 URLs? My PHP script is too slow doing it one URL at a time.

To be clear, at the top of the script I already use set_time_limit(0);

I use this bit of cURL code for each URL, but it is still slow:

    $curl_connection = curl_init($jsonurl);
    curl_setopt($curl_connection, CURLOPT_CONNECTTIMEOUT, 30);
    curl_setopt($curl_connection, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($curl_connection, CURLOPT_SSL_VERIFYPEER, false);

    // each request blocks until its response arrives
    $data = json_decode(curl_exec($curl_connection), true);
    curl_close($curl_connection);

What do you think about this?

SimpleojbC
  • My thought on this? Change the logic to **not** load 100 JSON files, or do so in the background as a server-side script instead. – h2ooooooo Aug 14 '13 at 15:17
  • A JSON document with a few hundred URLs inside shouldn't take all that long; it all depends on what you do with them. – Ja͢ck Aug 14 '13 at 15:21
  • Take a look at [this answer](http://stackoverflow.com/a/16431346/). At the end there is a multi `file_get_contents` example which runs pretty fast. – HamZa Aug 14 '13 at 15:24

2 Answers


This is almost impossible to answer without more context, but it sounds like a job for a job queue and a cron job to process the queue periodically.
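
A minimal sketch of that idea, assuming a `url_queue` table (the schema, connection details, and batch size are all placeholders for illustration): a cron entry such as `* * * * * php /path/to/worker.php` runs a worker that processes a small batch per invocation, so no single request has to fetch everything at once:

    <?php
    // worker.php - run periodically from cron.
    // Pulls a small batch of pending URLs from a queue table,
    // fetches each one, and stores the response.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

    $rows = $pdo->query(
        "SELECT id, url FROM url_queue WHERE status = 'pending' LIMIT 10"
    )->fetchAll(PDO::FETCH_ASSOC);

    foreach ($rows as $row) {
        $ch = curl_init($row['url']);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
        $payload = curl_exec($ch);
        curl_close($ch);

        // mark the item done so the next run picks up fresh ones
        $stmt = $pdo->prepare(
            "UPDATE url_queue SET status = 'done', payload = ? WHERE id = ?"
        );
        $stmt->execute([$payload, $row['id']]);
    }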

jeroen

You can investigate the curl_multi_* functions, which allow multiple cURL requests to run in parallel.

Here is a simple PHP REST client I built that leverages curl_multi_*. Feel free to use it.

https://github.com/mikecbrant/php-rest-client
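
For illustration, here is a minimal sketch of the bare curl_multi_* pattern (the URL list and timeout values are placeholders; the client above builds on these same functions):

    <?php
    // Fetch many JSON URLs in parallel with curl_multi_*.
    $urls = ['http://example.com/a.json', 'http://example.com/b.json'];

    $multi = curl_multi_init();
    $handles = [];
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        curl_multi_add_handle($multi, $ch);
        $handles[$url] = $ch;
    }

    // Drive all transfers at once instead of one at a time.
    do {
        $status = curl_multi_exec($multi, $running);
        if ($running) {
            curl_multi_select($multi); // wait for activity, don't busy-loop
        }
    } while ($running && $status === CURLM_OK);

    // Collect and decode each response.
    $results = [];
    foreach ($handles as $url => $ch) {
        $results[$url] = json_decode(curl_multi_getcontent($ch), true);
        curl_multi_remove_handle($multi, $ch);
        curl_close($ch);
    }
    curl_multi_close($multi);

With 100 URLs this turns 100 sequential round-trips into one batch whose total time is roughly bounded by the slowest response.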

Mike Brant