
To protect a feedback field on a forum-like page, I use the stopforumspam.com API to check visitors' IP addresses against the stopforumspam blacklist.

However, sometimes stopforumspam is down, either for maintenance or because spammers are DDoSing the domain, and that makes the page take almost forever to load.

My current code wraps the check in a

try { }
catch(Exception $e) { }

block.

In full detail:

$visitorip = getip();

try
{
    // using code from http://guildwarsholland.nl/phphulp/testspambot.php to try and block spammers
    $xml_string = file_get_contents('http://www.stopforumspam.com/api?ip='.$visitorip);
    $xml = new SimpleXMLElement($xml_string);
    if ($xml->appears == 'yes') {
        // the IP is on the blacklist: log it and stop the page
        $spambot = true;
        file_put_contents("list.txt", date('c').", ".$visitorip."\n", FILE_APPEND);
        $spambot_info = $visitorip.',';
        die("I'm sorry but there has been an error with the page, please try again tomorrow or contact the admin if your report can't wait, thank you!");
    }
}
catch(Exception $e)
{
    echo 'Error connecting to the main antispam checking database, please send an email to the admin through the main contact page if the problem lasts for more than a couple of hours, THANK YOU!! <br>Here is the complete error message to report: <br>'.$e->getMessage();
}

Where this is imperfect: when stopforumspam is down, there is a full 45-second wait on a blank page before the catch() error message shows up and my page finally loads. Most likely that delay comes from the default connection/socket timeout or the max PHP execution time.

Would you know how I can shorten the time during which my script attempts to connect (before throwing an error) to at most 10 seconds? Much appreciated!

EcchiOli
  • Instead of using `file_get_contents()` to retrieve the data from the remote server, use `cURL` with a timeout set. –  Feb 14 '15 at 09:17

1 Answer


You have a few different options. The answer to your question as stated is:

Make a request and wait to see how long it takes.

But you're already doing that, and it's not helping. I think what you really want to know is this:

How can I avoid making my users wait while I interact with a third-party that may be slow?

Set a timeout for your request

You will need to set a timeout on your connection. I would recommend using cURL with CURLOPT_CONNECTTIMEOUT.

$ch = curl_init();

$url = 'http://example.com/';
$timeout = 5; // seconds to wait for the connection before giving up

curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);        // return the response instead of printing it
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout); // abort if the connection takes too long

$result = curl_exec($ch); // false on failure (e.g. a timeout)
curl_close($ch);
// Now you can work with $result

Note that you will probably want to check for errors with curl_errno().
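
For example, a minimal sketch of that check, reusing the $ch and $result from the snippet above (the fallback behaviour is up to you):

$result = curl_exec($ch);

// curl_errno() returns 0 on success; 28 (CURLE_OPERATION_TIMEDOUT) signals a timeout
if ($result === false || curl_errno($ch) !== 0) {
    $error = curl_error($ch); // human-readable description of what went wrong
    curl_close($ch);
    // fall back gracefully instead of blocking the page,
    // e.g. skip the spam check or use a cached copy (see below)
} else {
    curl_close($ch);
    // safe to parse $result here
}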

For more information, see the curl manual and this question.
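
As an aside, if you would rather keep file_get_contents(), you can also give it a stream context with a timeout; this is only a sketch of that approach, reusing the $visitorip variable from your code:

$timeout = 10; // seconds

// the 'http' context option 'timeout' limits how long the request may take
$context = stream_context_create([
    'http' => ['timeout' => $timeout],
]);

$xml_string = @file_get_contents('http://www.stopforumspam.com/api?ip='.$visitorip, false, $context);

if ($xml_string === false) {
    // the request failed or timed out; handle it instead of letting the page hang
}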

Cache the results

I don't know how often stopforumspam is updated, but you could easily write their page to a local file and check against that when needed. Then you just need to read a local file, which will be much faster. For updating the cached version, you can either set up a scheduled task (cron) or check the modified time of the cached file on requests.
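
For example, a rough sketch of the modified-time approach; the list URL, file name, and lifetime are placeholders, and it assumes the cached export has one IP per line:

$blacklist_url  = 'http://www.stopforumspam.com/path-to-ip-list'; // placeholder URL
$cache_file     = 'stopforumspam_ips.txt';
$cache_lifetime = 86400; // refresh once a day, for example

// refresh the local copy when it is missing or stale
if (!file_exists($cache_file) || time() - filemtime($cache_file) > $cache_lifetime) {
    $fresh = file_get_contents($blacklist_url);
    if ($fresh !== false) {
        file_put_contents($cache_file, $fresh);
    }
}

// checking against the local file is fast even when the remote site is down
$blacklist = file($cache_file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$spambot   = in_array($visitorip, $blacklist, true);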

Combine both of these

Your best option will likely be some combination of both of these techniques.
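
A rough sketch of how the two could fit together, still using the hypothetical cache file and placeholder list URL from above: serve from the cache while it is fresh, refresh it with a short cURL timeout when it is stale, and fall back to the old copy if the refresh fails.

$cache_file     = 'stopforumspam_ips.txt'; // same hypothetical cache as above
$cache_lifetime = 86400;

if (!file_exists($cache_file) || time() - filemtime($cache_file) > $cache_lifetime) {
    // cache is missing or stale: try to refresh, but never wait more than a few seconds
    $ch = curl_init('http://www.stopforumspam.com/path-to-ip-list'); // placeholder URL
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10); // cap the whole request, not just the connect
    $fresh = curl_exec($ch);
    curl_close($ch);

    if ($fresh !== false) {
        file_put_contents($cache_file, $fresh);
    }
    // if the refresh failed, simply keep using the old cache file
}

$blacklist = file_exists($cache_file)
    ? file($cache_file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES)
    : [];
$spambot = in_array($visitorip, $blacklist, true);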

Jon Surrell