
I am designing a utility where my end users can bookmark a website. Before bookmarking the website, I built a PHP function to check the website's availability status and notify the user about downtime if the site is down. The following code snippet checks the domain's availability using PHP cURL and shows the status of the website.

The issue I am facing is that the response always comes back as false, and I don't know the reason. Any advice?

<?php

$URL = 'https://www.google.com';

if (isSiteAvailible($URL)) {
    echo 'The website is available.';
} else {
    echo 'Whoops, the site is not found.';
}

function isSiteAvailible($url) {
    // Check if a valid URL was provided
    if (!filter_var($url, FILTER_VALIDATE_URL)) {
        return false;
    }

    // Initialize cURL
    $curlInit = curl_init($url);

    // Set options: short connection timeout, fetch headers only (no body),
    // and return the transfer instead of printing it
    curl_setopt($curlInit, CURLOPT_CONNECTTIMEOUT, 10);
    curl_setopt($curlInit, CURLOPT_HEADER, true);
    curl_setopt($curlInit, CURLOPT_NOBODY, true);
    curl_setopt($curlInit, CURLOPT_RETURNTRANSFER, true);

    // Execute the request
    $response = curl_exec($curlInit);

    // Close the cURL session
    curl_close($curlInit);

    return $response ? true : false;
}
?>

When I added curl_error(), it reports:

"Recv failure: Connection was reset"

Why does this happen?
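
For reference, this is roughly how I retrieved that error string (a minimal sketch around the same request as above; the curl_errno() call is only an extra diagnostic and was not part of the original function):

<?php
// Minimal diagnostic sketch: the same HEAD-style request as in the question,
// but capturing the cURL error string when curl_exec() returns false.
$curlInit = curl_init('https://www.google.com');

curl_setopt($curlInit, CURLOPT_CONNECTTIMEOUT, 10);
curl_setopt($curlInit, CURLOPT_HEADER, true);
curl_setopt($curlInit, CURLOPT_NOBODY, true);
curl_setopt($curlInit, CURLOPT_RETURNTRANSFER, true);

$response = curl_exec($curlInit);

if ($response === false) {
    // curl_error()/curl_errno() must be called before curl_close()
    echo 'cURL error (' . curl_errno($curlInit) . '): ' . curl_error($curlInit);
}

curl_close($curlInit);
?>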

  • You could check [`curl_error()`](https://www.php.net/curl_error) after `curl_exec()` if `$response === false`. – Syscall Feb 02 '22 at 08:40
  • Does it work with "http"? If so, there is an issue with your cURL+SSL setup. – Salman A Feb 02 '22 at 08:40
  • When I added curl_error() it tells me "Recv failure: Connection was reset". Why? – mhussein76 Feb 02 '22 at 08:45
  • @Salman What could be the issue with cURL? – mhussein76 Feb 02 '22 at 08:46
  • Check this (if you didn't google it already): [CURL ERROR: Recv failure: Connection reset by peer - PHP Curl](https://stackoverflow.com/questions/10285700/curl-error-recv-failure-connection-reset-by-peer-php-curl). It gives you lots of the possible causes, so you can investigate. – ADyson Feb 02 '22 at 09:27
  • P.S. The concept of a website being "available" is really a flawed one anyway. All your curl request would prove is whether curl could connect to that site _from that server_, and _with those headers_, and _at that exact moment in time_. It doesn't show whether a site is "down" or not. It doesn't prove that anyone else, from another location, and another http client, at a moment in the future, will be able to connect successfully or not. In short, it's a pretty futile exercise. – ADyson Feb 02 '22 at 09:30
  • @ADyson: If I may ask, what is the best way to validate whether the site is down or not? I don't want my end user to fill his profile with websites that are no longer active. – mhussein76 Feb 02 '22 at 10:40
  • Technically, there is no way to do that with a single request. The only thing you can learn from the failure of a single request is that a single request has failed! You run a big risk of false positives - e.g. deciding a site is down or removed, when in fact there was just a temporary network problem or something. If you make a number of requests over a period of time and it repeatedly returns certain types of error then you might surmise that the site is down or no longer exists, but it's up to you to set a threshold of what errors would count and what you think constitutes enough evidence (a rough sketch of that idea appears after these comments). – ADyson Feb 02 '22 at 10:50
  • `I don't want my end user to fill his profile with websites that are no longer active`...to be honest, isn't that the user's problem, not yours? They can check this for themselves if they really wish to. And just because the site existed on the day they added it, doesn't mean it will still exist the day after. It's not really clear what problem you're trying to solve with this...the web is full of dead links, but it's up to the people who create the content containing those links to maintain it accurately. At worst, you simply get some dead links; it's annoying but not disastrous. – ADyson Feb 02 '22 at 10:53
  • @ADyson: Is there any solution other than using cURL, to avoid the complexity behind it? – mhussein76 Feb 02 '22 at 11:10
  • What do you mean exactly? What complexity specifically are you referring to? cURL is just an HTTP client. You could replace it with another HTTP client to make the same requests, but fundamentally it's not a case of cURL vs anything else; the conceptual problem is that the whole idea you're trying to implement just doesn't make much sense, and doesn't seem to add much value to your website. Again, what problem exactly are you trying to solve? Surely your users are responsible for the content they upload, not you? – ADyson Feb 02 '22 at 11:15
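
A rough sketch of the repeated-check idea ADyson describes above (the attempt count, delay and the checkOnce() helper here are made-up values for illustration, not anything prescribed in the discussion):

<?php
// Hypothetical sketch: treat a site as "probably down" only after several
// consecutive failed checks, rather than after a single failed request.

// One availability check, wrapping the same kind of cURL call as in the question.
function checkOnce($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
    curl_setopt($ch, CURLOPT_NOBODY, true);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $ok = (curl_exec($ch) !== false);
    curl_close($ch);
    return $ok;
}

// Report "down" only if every attempt failed; in a real setup the checks
// would more likely run from a scheduled job spread over hours or days.
function isProbablyDown($url, $attempts = 3, $delaySeconds = 5) {
    for ($i = 0; $i < $attempts; $i++) {
        if (checkOnce($url)) {
            return false; // at least one attempt succeeded
        }
        if ($i < $attempts - 1) {
            sleep($delaySeconds); // space the checks out a little
        }
    }
    return true;
}
?>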

0 Answers