
It seems that after 1019 or 1020 requests I start getting blank responses; bandwidth and server resources are not an issue.

function parallel($nodes) {
    $start = microtime(true);
    $node_count = count($nodes);
    $curl_arr = array();
    $master = curl_multi_init();

    for ($i = 0; $i < $node_count; $i++) {
        $url = $nodes[$i];
        $curl_arr[$i] = curl_init($url);
        curl_setopt($curl_arr[$i], CURLOPT_RETURNTRANSFER, true);
        curl_setopt($curl_arr[$i], CURLOPT_ENCODING, 'gzip');
        curl_setopt($curl_arr[$i], CURLOPT_IPRESOLVE, CURL_IPRESOLVE_V4);
        curl_multi_add_handle($master, $curl_arr[$i]);
    }

    do {
        curl_multi_exec($master, $running);
        if ($running > 0) {
            curl_multi_select($master); // block until activity instead of busy-looping
        }
    } while ($running > 0);

    for ($i = 0; $i < $node_count; $i++) {
        $results = curl_multi_getcontent($curl_arr[$i]);
        //echo( $i . "\n" . $results . "\n");
        curl_multi_remove_handle($master, $curl_arr[$i]);
        curl_close($curl_arr[$i]); // release the handle's file descriptor
    }
    curl_multi_close($master);
    echo "\n";
    echo microtime(true) - $start;
}

Basically, does anyone know of any settings that could be causing this problem? Or is there a limit within cURL (I don't think so)? It's not the site I'm pulling data from, either; google.com also blanks out at 1020.

All ideas appreciated and welcome.

Sven Kahn
  • Why are you trying to use 1020 cURL sessions? I think the best thing to do, rather than fixing your server, is to design it so that you don't need to hold 1020 cURL sessions at ONE TIME. – Allison Apr 17 '15 at 22:25
  • It seems to be the most efficient way to pull the data I'm pulling; it only takes 3 seconds to do the 1000 requests. It's that or run 6000 instances of a script in the background and use a lot of resources. I know it's not a great thing to do, but it's the best solution so far – Sven Kahn Apr 17 '15 at 22:29
  • Try this http://stackoverflow.com/a/13873307/1166266 – Allison Apr 17 '15 at 22:44
  • I saw that and ya I think I'll look into it – Sven Kahn Apr 18 '15 at 02:28
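The rolling-window approach linked in the comments can be sketched as follows: instead of holding all 1000+ handles open at once, process the URLs in fixed-size batches so only a bounded number of file descriptors are ever open. This is a hypothetical sketch, not the asker's code; the function name `parallel_batched` and the window size are assumptions for illustration.

```php
<?php
// Sketch of the batching idea: never hold more than $window cURL
// handles (and thus file descriptors) open at the same time.
function parallel_batched(array $nodes, int $window = 100): array {
    $results = [];
    foreach (array_chunk($nodes, $window) as $chunk_idx => $batch) {
        $master = curl_multi_init();
        $handles = [];
        foreach ($batch as $i => $url) {
            $ch = curl_init($url);
            curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
            curl_multi_add_handle($master, $ch);
            $handles[$i] = $ch;
        }
        do {
            curl_multi_exec($master, $running);
            if ($running > 0) {
                curl_multi_select($master); // wait for activity, no busy loop
            }
        } while ($running > 0);
        foreach ($handles as $i => $ch) {
            // Map the result back to its position in the original $nodes array.
            $results[$chunk_idx * $window + $i] = curl_multi_getcontent($ch);
            curl_multi_remove_handle($master, $ch);
            curl_close($ch); // free the descriptor before the next batch
        }
        curl_multi_close($master);
    }
    return $results;
}
```

With a window of 100, a 6000-URL job peaks at roughly 100 open sockets instead of 6000, trading a little wall-clock time for staying well under the default `nofile` limit.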

1 Answer


The problem ended up being the open-file limits that were set on the server.

I updated the limits in /etc/security/limits.conf, I added the following:

*                soft    nofile          40000
*                hard    nofile          40000
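To confirm the raised limit actually applies to the PHP process, it can be read back at runtime. This is a small sketch assuming the POSIX extension is loaded (the function name `open_file_limit` is an invention for illustration):

```php
<?php
// Sketch: read the soft open-file limit for the current process.
// Returns null if the POSIX extension is not available.
function open_file_limit(): ?int {
    if (!function_exists('posix_getrlimit')) {
        return null;
    }
    $limits = posix_getrlimit();
    // posix_getrlimit() reports the limit under the 'soft openfiles' key.
    return isset($limits['soft openfiles']) ? (int) $limits['soft openfiles'] : null;
}
```

If the reported value is still the old default (often 1024, which matches the ~1020 ceiling after stdin/stdout/stderr and other descriptors), the process likely needs a fresh login session or service restart for the limits.conf change to take effect.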
Sven Kahn
    Could you go into more detail about how you solved your problem, so we don't end up with no one knowing what you did? – Allison Apr 20 '15 at 12:41