
I'm trying to build a basic "status" page using PHP that will tell my users whether various services (web pages we use) are at least serving up pages (which isn't a 100% guarantee that they're working, but it's a pretty good indicator).

What I would like to do is hit something like

www.domainname.com/mediawiki/index.php and check whether it returns the page or not.

I'm pretty new to PHP, so I'm not even sure what function I'm looking for.

Thanks

Crash893
  • Whichever solution you pick, don't forget that you can "play" with timeouts. For example, with file_get_contents you can set a timeout via a stream context passed as the third parameter. See: http://php.net/manual/fr/function.file-get-contents.php – Cybrix Oct 20 '10 at 18:25
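A minimal sketch of that timeout idea (the 5-second value and the URL are just placeholders): the third argument of file_get_contents is a stream context, and the timeout goes inside its http options.

<?php
    // Build a stream context carrying the timeout, then pass it as the third argument
    $context = stream_context_create(array(
        'http' => array('timeout' => 5), // give up after 5 seconds
    ));

    $page = @file_get_contents('http://www.domainname.com/mediawiki/index.php', false, $context);

    if ($page === false)
    {
        echo "Service did not respond in time.";
    }
?>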

4 Answers


There are ways to use built-in PHP functions to do this (e.g. file_get_contents), but they aren't very good. I suggest you take a look at the excellent cURL library. This might point you in the right direction: Header only retrieval in php via curl

Since you just want to see if a page is "up", you don't need to request the whole page; you can just use a HEAD request to get the headers for the page.
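A minimal sketch of that HEAD-only check (assuming the cURL extension is installed; the URL is just the one from the question and the 10-second timeout is arbitrary):

<?php
    // Issue a HEAD request and treat any HTTP status below 400 as "up"
    $ch = curl_init('http://www.domainname.com/mediawiki/index.php');
    curl_setopt($ch, CURLOPT_NOBODY, true);         // HEAD: headers only, no body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // don't echo anything
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // give up after 10 seconds
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);  // 0 if the request never completed
    curl_close($ch);

    echo ($code > 0 && $code < 400) ? 'Service is up' : 'Service is down';
?>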

Eli
  • file_get_contents is disabled on many shared hosting systems and it returns more data than you need to determine if the page is available. – Alan Geleynse Oct 20 '10 at 18:12
  • True but you'd be surprised how many scripts are (misguidedly) written to die on HEAD. Plus, you can set file_get_contents to perform a HEAD request. Either way, I'd suggest searching the returned source for a certain string for better determination. – webbiedave Oct 20 '10 at 18:14
  • @Alan and `cURL` is always enabled? I don't think so. – Cybrix Oct 20 '10 at 18:14
  • @Cybrix cURL is not always enabled either, but most places I have seen file_get_contents disabled, cURL is. If you have it available, cURL is usually a better choice due to its flexibility, but you have to just use what you have available on your system. – Alan Geleynse Oct 20 '10 at 18:17
  • @webbiedave That's a good point, you would have to see what the script you are using returns. And if it is not a very large page, it won't make much of a difference anyway. – Alan Geleynse Oct 20 '10 at 18:18
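A minimal sketch of the HEAD-via-file_get_contents idea from webbiedave's comment above (the URL and the 5-second timeout are placeholders):

<?php
    // The http context options let file_get_contents send a HEAD request
    $context = stream_context_create(array(
        'http' => array('method' => 'HEAD', 'timeout' => 5),
    ));

    $result = @file_get_contents('http://www.domainname.com/mediawiki/index.php', false, $context);

    // A HEAD response has no body, so look at the status line instead
    echo (isset($http_response_header[0]) && strpos($http_response_header[0], '200') !== false)
        ? 'Service is up' : 'Service is down';
?>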

Check out file_get_contents

It will return the web page's source as a string. That way you could even search the string for a specific value, if desired, for a finer-grained check. This can be very useful in case a page is still returned but contains some kind of error message.

$somePage = file_get_contents('http://www.domainname.com/mediawiki/index.php');
// $somePage now contains the HTML source, or false on failure

Ensure allow_url_fopen = On in your php.ini

If you need to check the response headers, you can use $http_response_header.
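For example (a sketch; checking the status line and searching the body for "MediaWiki" are assumptions, not part of the original answer):

$somePage = @file_get_contents('http://www.domainname.com/mediawiki/index.php');

// $http_response_header is populated after the call; element 0 is the HTTP status line
$gotOk     = isset($http_response_header[0]) && strpos($http_response_header[0], '200') !== false;
// Searching the source for an expected string catches pages that load but only show an error
$looksSane = $somePage !== false && strpos($somePage, 'MediaWiki') !== false;

echo ($gotOk && $looksSane) ? 'Service looks up' : 'Service looks down';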

webbiedave

Try this:

<?php
    $_URL = "http://www.domainname.com/mediawiki/index.php";

    // file_get_contents returns false on failure; the strict check avoids
    // treating an empty (but successfully served) page as "down"
    if (@file_get_contents($_URL) === false)
    {
        echo "Service not responding.";
    }
?>

Note that allow_url_fopen must be enabled in your php.ini.
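If you are not sure whether that setting is on, a quick check (just a sketch):

<?php
    // Report whether allow_url_fopen is enabled in the current configuration
    if (! ini_get('allow_url_fopen'))
    {
        echo "allow_url_fopen is disabled; file_get_contents cannot fetch URLs here.";
    }
?>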

Good luck

Cybrix

Another option would be to see if the socket is responding. (I can't remember where I got this from originally, but it lets me know if port 80 is responding.) You could always point this at a different port, as sketched after the code.

function server($addr){
    // Keep only the host portion (drop anything after the first '/')
    if (strstr($addr, '/')) { $addr = substr($addr, 0, strpos($addr, '/')); }
    return $addr;
}

$link    = 'secure.sdinsite.net:';
$s_link  = str_replace('::', ':', $link);
$address = explode(':', $s_link);
// Try to open a TCP connection to port 80 with a 20-second timeout
$churl   = @fsockopen(server($address[0]), 80, $errno, $errstr, 20);

if (!$churl) {
    $status = 'dead';
} else {
    $status = 'live';
}

echo $status;
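To point the same check at a different port (the 443 below is just an example, reusing the hostname from the snippet above):

$churl = @fsockopen('secure.sdinsite.net', 443, $errno, $errstr, 20);
echo $churl ? 'live' : 'dead';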
Jason