It seems that after 1019 or 1020 requests I start getting blank responses; bandwidth and server resources are not an issue.
function parallel($nodes) {
    $start = microtime(true);
    $node_count = count($nodes);
    $curl_arr = array();
    $master = curl_multi_init();

    for ($i = 0; $i < $node_count; $i++) {
        $url = $nodes[$i];
        $curl_arr[$i] = curl_init($url);
        curl_setopt($curl_arr[$i], CURLOPT_RETURNTRANSFER, true);
        curl_setopt($curl_arr[$i], CURLOPT_ENCODING, 'gzip');
        curl_setopt($curl_arr[$i], CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4);
        curl_multi_add_handle($master, $curl_arr[$i]);
    }

    // Run the transfers; curl_multi_select() avoids busy-waiting at 100% CPU.
    do {
        curl_multi_exec($master, $running);
        curl_multi_select($master);
    } while ($running > 0);

    for ($i = 0; $i < $node_count; $i++) {
        $results = curl_multi_getcontent($curl_arr[$i]);
        //echo($i . "\n" . $results . "\n");
        curl_multi_remove_handle($master, $curl_arr[$i]);
        curl_close($curl_arr[$i]);
    }
    curl_multi_close($master);

    echo "\n";
    echo microtime(true) - $start;
}
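To see whether the blanks correspond to an actual cURL failure (rather than a genuinely empty body), each easy handle can be inspected after the multi loop finishes. This is a minimal diagnostic sketch, not part of the original function; the URLs are placeholders:

```php
<?php
// Minimal sketch: after curl_multi_exec() finishes, curl_errno()/curl_error()
// still work on the individual handles, so a blank body caused by a failed
// transfer (e.g. a connection/resource limit) should show a non-zero errno.
$urls = array('https://example.com/', 'https://www.google.com/');
$master = curl_multi_init();
$handles = array();

foreach ($urls as $i => $url) {
    $handles[$i] = curl_init($url);
    curl_setopt($handles[$i], CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($master, $handles[$i]);
}

do {
    curl_multi_exec($master, $running);
    curl_multi_select($master);
} while ($running > 0);

$failures = array();
foreach ($handles as $i => $h) {
    $body = curl_multi_getcontent($h);
    if (curl_errno($h) !== 0 || $body === '' || $body === null) {
        // Record which requests came back blank and why cURL says they failed.
        $failures[$i] = curl_errno($h) . ': ' . curl_error($h);
    }
    curl_multi_remove_handle($master, $h);
    curl_close($h);
}
curl_multi_close($master);

print_r($failures);
```

If the blank responses show up here with a non-zero error code, that points at a connection or resource problem on the client side rather than the remote site.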
Does anyone know of any settings that could be causing this problem? Or is there a limit within cURL (I don't think so)? It's not the site I'm pulling data from either; google.com also blanks out at 1020.
All ideas appreciated and welcome.