
I'm using a script to detect which domains are still working, reading them from a file that I uploaded via a form.

The structure of the file is `domain:description` on each line (about 20 lines).

I'm using the following code to go through each line of the file and test each domain:

$file = file($_FILES['products_file']['tmp_name']);
$count = count($file);
if($count > 0)
{
    $good = array();
    $bad = array();
    foreach($file as $line)
    {
        $line = trim(preg_replace('/\s+/', ' ', $line));
        $data = explode(":", $line);

        $test = testURL("https://".$data[0]."/"); // $data[0] is the domain, $data[1] the description
        if ($test !== 404 && !empty($test)) {
            $good[] = $line;
        } else {
            $bad[] = $line;
        }
    }
}

My problem is that I get the error *Maximum execution time of 30 seconds exceeded* on the line with the cURL call `curl_exec($ch)`.

This is my `testURL` function with the problem:

function testURL($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    $data = curl_exec($ch);
    // Read the status code before closing the handle, otherwise
    // curl_getinfo() operates on an already-closed resource.
    $httpCode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    if($httpCode == 404) {
        return 404;
    }

    if (empty($data)) {
        return "";
    }

    return $data;
}

To resolve this problem, I want to know if there is a way to make cURL cancel its operation and return an empty string if the URL does not respond within 5 seconds.

Paradox
  • I tried setting the `CURLOPT_TIMEOUT` to 5 seconds but I still get the error `Maximum execution time of 30 seconds exceeded` – Paradox Jul 02 '15 at 14:50
  • Your error message is a PHP error, not a cURL error. Increase PHP's execution time: http://stackoverflow.com/a/5164954/1682509 – Reeno Jul 02 '15 at 14:57
  • @Reeno But if I increase PHP's execution time I'll have to wait about 10 minutes for my 20 lines from my file. I want to go through them quickly. – Paradox Jul 02 '15 at 14:59
  • Then either set cURL's timeout to a very low value or call the PHP script multiple times, each time processing only some of the URIs. You can't speed up cURL, only decrease the timeouts, but then you won't get the return status for all of the requests. – Reeno Jul 02 '15 at 15:02

1 Answer


Take a look at `CURLOPT_TIMEOUT`: https://php.net/curl-setopt
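A minimal sketch of how the question's `testURL` function could apply that option, assuming the 5-second limit from the question. `CURLOPT_CONNECTTIMEOUT` and the `set_time_limit(0)` call are additions beyond what the answer names, since the comments show the 30-second limit comes from PHP itself, not from cURL:

```php
<?php
function testURL($url) {
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    // Give up if the connection is not established within 5 seconds...
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
    // ...and abort the whole transfer after 5 seconds total.
    curl_setopt($ch, CURLOPT_TIMEOUT, 5);

    $data = curl_exec($ch);
    // Read the status code before closing the handle.
    $httpCode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);

    if ($httpCode == 404) {
        return 404;
    }
    // curl_exec() returns false on timeout/failure; treat that as "".
    if ($data === false || $data === '') {
        return '';
    }
    return $data;
}

// Even with the 5-second cURL timeouts, 20 slow domains can still take
// up to 100 seconds in total, which exceeds PHP's default 30-second
// limit, so raise (or remove) the script limit as well:
set_time_limit(0);
```

With both timeouts in place a dead domain costs at most 5 seconds instead of hanging until PHP kills the script.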

Syscall
GeoffreyB