
I'm using the simplehtmldom parser to parse URL contents.

It contains the following line of code:

$contents = file_get_contents($url, $use_include_path, $context, $offset);

file_get_contents emits a warning when the URL does not exist. How can I handle all possible errors for this case, i.e. 404 and any other failure?

Warning: file_get_contents(http://www.something-not-valid-bla-bla-1234.com/): failed to open stream: php_network_getaddresses: getaddrinfo failed: Name or service not known in simple_html_dom.php on line 76
    I would not do what either of the answers below suggest; I would start by switching to cURL, then handle its errors in a more intelligent way for more meaningful output. –  Jul 03 '14 at 03:20

2 Answers


Put an @ in front of file_get_contents and then check whether $contents === false:

$contents = @file_get_contents($url, $use_include_path, $context, $offset);
if ($contents === false) {
  //Something went wrong.
}

EDIT: The @ suppresses the error from being dumped out.
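If you still want to see what went wrong after suppressing the warning, error_get_last() returns the most recent error, including the suppressed one. A minimal sketch (the URL here is hypothetical; any unreachable host or missing file takes the same path):

```php
<?php
// Hypothetical unreachable URL for illustration.
$url = 'http://www.example.invalid/';

$contents = @file_get_contents($url);
if ($contents === false) {
    // error_get_last() returns an array with 'type', 'message',
    // 'file', and 'line' keys describing the suppressed warning.
    $error = error_get_last();
    echo 'Fetch failed: ' . $error['message'] . PHP_EOL;
}
```

This way the @ keeps the warning out of your output, but you can still log or display the underlying cause (DNS failure, 404, timeout, etc.).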


I prefer to do it this way (single assignment):

if( false === ( $contents = @file_get_contents($url, $use_include_path, $context, $offset))) {
  //Something went wrong.
}

Also take note of the strict === equality operator, as opposed to just ==.

Update: I agree with Dagon about cURL. Since you are accessing content via a URL, even file_get_contents may not work depending on your PHP settings (e.g. allow_url_fopen). For further info I found this related question; the explanation is quite lengthy, though, and would require some changes in the implementation.

Get content from a url using php
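To illustrate the cURL approach, here is a rough sketch (the URL is hypothetical): CURLOPT_FAILONERROR makes curl_exec() return false on HTTP status codes >= 400, and curl_error() / curl_getinfo() give you a meaningful message and the status code.

```php
<?php
// Hypothetical URL; swap in the one you are parsing.
$url = 'http://www.example.invalid/';

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_FAILONERROR, true);    // treat HTTP >= 400 (e.g. 404) as an error
curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // fail instead of hanging forever

$contents = curl_exec($ch);
if ($contents === false) {
    // curl_error() describes the failure (DNS lookup, 404, timeout, ...);
    // curl_getinfo() exposes the HTTP status code when a response arrived.
    $message = curl_error($ch);
    $status  = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    echo "Request failed (HTTP $status): $message" . PHP_EOL;
}
curl_close($ch);
```

You could then pass $contents to simple_html_dom's str_get_html() instead of letting it call file_get_contents itself.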
