
I'm running a script which checks the availability of a domain name (10 times) and outputs the domain, if available, along with a timestamp (with milliseconds).

Can you find anything which is slowing down the script, even marginally? If you could please adjust and re-post, or advise what can be done better, it would be very much appreciated! Thank you.

<?php

    date_default_timezone_set('Australia/Brisbane');
    $loops = 0; 

    function udate($format, $utimestamp = null) {
      if (is_null($utimestamp))
        $utimestamp = microtime(true);

      $timestamp = floor($utimestamp);
      $milliseconds = round(($utimestamp - $timestamp) * 1000000);

      return date(preg_replace('`(?<!\\\\)u`', $milliseconds, $format), $timestamp);
    }

    function GetCurlPage ($pageSpec)
    {
      $ch = curl_init($pageSpec);
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
      curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, FALSE);
      curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
      $tmp = curl_exec ($ch);
      curl_close ($ch);
      $tmp = preg_replace('/(?s)<meta http-equiv="Expires"[^>]*>/i', '', $tmp);
      $tmp = explode('<br>', $tmp);
      foreach ($tmp AS $line) {
        //echo '<pre>';
        //print_r($line);
        //echo '</pre>';
      }
      // Do something with each line.
      echo $tmp[0];
      echo "<br>";
      echo $tmp[1];
      //echo $tmp[2];
      echo "<br>";
      echo udate('H:i:s:u');
      echo "<br><br>";

      return $tmp;

    }

    while ($loops < 10)
    {
      $suffixes = urlencode("com.au");
      $domain = "sampledomain";
      $fuzzysearch = "0";
      $returnUrl = "http://mydomain.com.au/test.php";
      $url = "https://apidomain.com.au/check.php?domain=" .
             $domain . "&suffixes=" . $suffixes . "&fuzzysearch=" . $fuzzysearch;
      $output = GetCurlPage("$url");

      ++$loops;
    }
?>
iCeR

3 Answers


The slowness is because you need to make 10 curl requests to an external site.

Two suggestions:

  • update your test.php/check.php to allow checking multiple domain names in a single curl call (instead of checking them one by one, pass an array)
  • use curl_multi_exec to fetch the 10 different URLs in parallel

I would prefer suggestion 1.

ajreal
  • @ajreal it seems like the best option! That, with the other users' suggestions, should do the trick :) Thanks. Now how can I set up an array with the curl call for multiple domains? Sorry, first-time user of curl – iCeR Dec 17 '10 at 18:18
  • @iCeR - such as `https://apidomain.com.au/check.php?domain1=abc.com&domain2=def.com&domain3=ghi...` **OR** `https://apidomain.com.au/check.php?domain[]=abc.com&domain[]=def.com&domain[]=ghi...` – ajreal Dec 17 '10 at 18:24
  • No luck coding it up :( sorry. Could you possibly help by editing my code above? – iCeR Dec 17 '10 at 18:51
  • @iCeR - You can build the query using [http_build_query](http://php.net/manual/en/function.http-build-query.php) . Do you have access to check.php? – ajreal Dec 17 '10 at 18:55
  • oh... try method 2 instead... if you're not able to update check.php, here is an example of how to use [curl_multi_exec](http://stackoverflow.com/questions/1669541/replacing-do-while-loops) – ajreal Dec 17 '10 at 19:01
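
For reference, here is a minimal sketch of suggestion 2 (parallel requests with `curl_multi_exec`), assuming the same `apidomain.com.au` endpoint and query string as in the question; it is an illustration only, not tested against the real API.

```php
<?php
// Minimal sketch of suggestion 2: fire the 10 checks in parallel with curl_multi.
// The endpoint and parameters are copied from the question; adjust as needed.
$suffixes    = urlencode("com.au");
$domain      = "sampledomain";
$fuzzysearch = "0";
$url = "https://apidomain.com.au/check.php?domain=" . $domain .
       "&suffixes=" . $suffixes . "&fuzzysearch=" . $fuzzysearch;

$mh      = curl_multi_init();
$handles = array();

// Queue all 10 requests before running any of them.
for ($i = 0; $i < 10; $i++) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}

// Run the handles until every transfer has finished.
$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // wait for activity instead of busy-looping
} while ($running > 0);

// Collect the responses and clean up.
foreach ($handles as $ch) {
    echo curl_multi_getcontent($ch) . "<br>";
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
?>
```

For suggestion 1, the query for several domains could be built with `http_build_query`, e.g. `http_build_query(array('domain' => array('abc.com', 'def.com')))`, provided check.php is updated to accept an array.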
  • Don't put the code from `$suffixes = urlencode("com.au");` until `$domain . "&suffixes=" . $suffixes . "&fuzzysearch=" . $fuzzysearch;` inside the loop; none of it changes between iterations, so build the URL once beforehand.
  • Remove the empty `foreach ($tmp AS $line) { ... }` loop.
  • Don't do the regex work in `udate()` and don't use a parameter there; instead let `udate()` build the string using concatenation (see the sketch below).
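
A rough sketch of those three changes, assuming the rest of the script (`GetCurlPage()` etc.) stays as posted in the question; untested:

```php
<?php
// 3. udate() without the regex and without a format parameter:
//    hard-code the format and append the microseconds by concatenation.
function udate() {
    $utimestamp   = microtime(true);
    $timestamp    = floor($utimestamp);
    $microseconds = round(($utimestamp - $timestamp) * 1000000);
    return date('H:i:s', $timestamp) . ':' . $microseconds;
}

// 1. Build the URL once, before the loop, since it never changes.
$suffixes    = urlencode("com.au");
$domain      = "sampledomain";
$fuzzysearch = "0";
$url = "https://apidomain.com.au/check.php?domain=" . $domain .
       "&suffixes=" . $suffixes . "&fuzzysearch=" . $fuzzysearch;

// 2. The empty foreach inside GetCurlPage() is simply deleted.
for ($loops = 0; $loops < 10; $loops++) {
    $output = GetCurlPage($url);
}
?>
```

The call inside `GetCurlPage()` then becomes `echo udate();` instead of `echo udate('H:i:s:u');`.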
thejh

Change `if (is_null($utimestamp))` to `if ($utimestamp === null)` so PHP doesn't have to call the function `is_null()`.
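
Applied to the `udate()` function from the question, the change is just:

```php
// Before: PHP has to call is_null() on every invocation.
if (is_null($utimestamp))
    $utimestamp = microtime(true);

// After: same behaviour, a plain comparison, no function call.
if ($utimestamp === null)
    $utimestamp = microtime(true);
```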

Marc-François