
The following script just seems to run forever. It never gets to finished.

$ch = curl_init();
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);

for ($i = 500; $i < 3000; $i++) {

    $url = "http://abcedfg.com/$i/index.html";

    curl_setopt($ch, CURLOPT_URL, $url);
    $response = curl_exec($ch);
    $httpCode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
}
  • There is an error in your loop head: it needs to be `$i<3000` and `$i++`. – codedge Apr 29 '20 at 22:46
  • Sorry, I have that right in my actual code, so that's not it. –  Apr 29 '20 at 22:49
  • @kwapster Then you should click the edit button below your question and make sure the code you pasted is aligned with the one you're actually running, so that people trying to help you have the correct information to do so. – Zeitounator Apr 29 '20 at 23:04
  • Just did that, thanks. –  Apr 29 '20 at 23:12

2 Answers

1

Try wrapping curl_init and curl_close around every request.

Like this:

function callurl($myurl) {
    // Use a fresh handle per request and close it when done.
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $myurl);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_NOBODY, true);      // HEAD-style request: no body
    curl_setopt($ch, CURLOPT_HEADER, true);      // include the headers in the return value
    $response = curl_exec($ch);
    curl_close($ch);
    return $response;
}

You'll then have to call this function for every URL, for example with a for loop, as sketched below.
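For example, a minimal sketch of such a loop (the URL pattern and range are taken from the question, `callurl` is the function above, and printing the status line is just an assumption for showing progress):

for ($i = 500; $i < 3000; $i++) {
    $url = "http://abcedfg.com/$i/index.html";
    $headers = callurl($url);                             // headers only, since CURLOPT_NOBODY is set
    echo $i . ": " . strtok($headers, "\r\n") . PHP_EOL;  // print the status line as progress
}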

Also test with only 10-20 requests before you go big. Consider that 2500 requests, if every request takes 1 second, translates to about 41 minutes of activity. No server is configured by default to keep a PHP script running for 40 minutes. You can change this setting on the server if you have access to it.
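For example, if the limit you're hitting is PHP's own execution time limit, a minimal sketch for raising it from inside the script (assuming your host allows overriding it) looks like this:

// Lift PHP's default 30-second execution limit for this script only.
set_time_limit(0);                        // 0 means no limit
// Or, equivalently, through the ini setting:
ini_set('max_execution_time', '3600');    // e.g. allow up to one hour

On many shared hosts these values are locked down, in which case the limit has to be raised in php.ini or the web server configuration instead.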

It's also possible that you're stuck because the server doesn't have enough resources to make so many requests at the same time. Ideally you should fine-tune your server configuration to achieve better performance.

Also consider using curl_multi_init for better performance and asynchronous requests.

But this will not guarantee that requests won't be dropped because of a timeout, so fine-tuning the server may still be needed.
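A minimal sketch of that approach (the URL pattern and range come from the question; batching into groups of 50 and the 10-second timeout are assumptions to keep the number of parallel connections and the impact of slow URLs under control):

// Check a batch of URLs in parallel with curl_multi.
function check_batch(array $urls) {
    $mh = curl_multi_init();
    $handles = [];
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_NOBODY, true);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);      // don't let one slow URL hang the whole batch
        curl_multi_add_handle($mh, $ch);
        $handles[$url] = $ch;
    }

    // Drive all transfers until they have finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh);                 // wait for activity instead of busy-looping
        }
    } while ($running && $status === CURLM_OK);

    $codes = [];
    foreach ($handles as $url => $ch) {
        $codes[$url] = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $codes;
}

// Walk the range from the question in batches of 50 URLs.
for ($i = 500; $i < 3000; $i += 50) {
    $urls = [];
    for ($j = $i; $j < min($i + 50, 3000); $j++) {
        $urls[] = "http://abcedfg.com/$j/index.html";
    }
    foreach (check_batch($urls) as $url => $code) {
        echo $code . "  " . $url . PHP_EOL;
    }
}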

Check also this post for how to increase the time limit:

Claudio Ferraro
-1
  1. It's better to close the handle every time you open it, so that the memory for the open handle is released.
  2. You can list all the URLs by running a loop, and then do a single multi-curl request, as sketched below.
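For point 2, a minimal sketch of building the list first (range and URL pattern from the question; `check_batch` refers to the hypothetical curl_multi helper sketched in the previous answer):

// Collect every URL up front, then hand the whole list to a multi-curl helper.
$urls = [];
for ($i = 500; $i < 3000; $i++) {
    $urls[] = "http://abcedfg.com/$i/index.html";
}
$results = check_batch($urls);   // hypothetical helper from the sketch above

In practice you would probably still want to split the 2500 URLs into smaller batches, as in the earlier sketch, rather than opening them all at once.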