
I am attempting to use the file_get_contents function to print the contents of an image URL on the screen:

<?php
$image2 = "http://www.example.com";
echo file_get_contents( $image2 );
?>

When run, the page takes about 15-20 seconds to load, then displays nothing. I've also attempted to use cURL, which gave the same result. Anyone have any suggestions on how to fix this?

Here is the curl code that I tried:

<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.example.com");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$contents = curl_exec($ch);
curl_close($ch);
echo $contents;
?>

When run, the page keeps loading until the server cancels the request.

Titan9251
  • There are some concerns with error handling when you use `file_get_contents`: if it works, it works; if it doesn't, it may take time to find out why. In this case you can try to replace it with `curl`, which can give you more details about what the issue is. – Axalix Oct 24 '15 at 21:03
  • Send a proper `Content-Type: image/jpeg` header to the browser. Maybe the default header is `text/html`, so the browser can't understand it (see the sketch after these comments). – Deadooshka Oct 25 '15 at 05:30
  • You should do the retrieval of images on the client side and just return the URL. Further, closing the PHP tag (`?>`) will cause linebreaks and other code after it to be appended to the image, which is not what you want and is why PSR coding guidelines forbid it. Lastly, how exactly do you determine that it returns an empty string? – Ulrich Eckhardt Oct 25 '15 at 06:36
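
A minimal sketch of the Content-Type suggestion from the comments above, assuming the target is a JPEG image and using the same placeholder URL as the question:

<?php
$image2 = "http://www.example.com"; // placeholder, as in the question

$data = file_get_contents($image2);
if ($data !== false) {
    header('Content-Type: image/jpeg'); // must be sent before any other output
    echo $data;
}
// No closing ?> tag, so no stray whitespace is appended to the image data.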

2 Answers


`file_get_contents`, like many PHP functions, returns `false` in case of failure. Echoing `false` produces no output, which is why you don't see anything, exactly as if an empty string had been echoed.

Store the result in a variable and compare it against `false` with the `===` operator to detect errors.

$file = file_get_contents($image2);
if ($file === false) {
  // Check the error and handle it as you like.
} else {
  echo $file; // echo the downloaded data, not the URL
}

Unfortunately, that only tells you whether there was an error, not which one. Fortunately, you've tried cURL as well: `curl_error` and `curl_errno` give you a more detailed description of the error itself. Usually that clears things up right away.
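
A minimal sketch of that check, reusing the placeholder URL from the question:

<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.example.com");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$contents = curl_exec($ch);

if ($contents === false) {
    // Prints something like "28: Connection timed out after ..." on a timeout.
    echo curl_errno($ch) . ': ' . curl_error($ch);
} else {
    echo $contents;
}
curl_close($ch);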

My suspicion at this point: normally I would attribute a failure of `file_get_contents` to the `allow_url_fopen` setting, but the 15-20 second delay, together with the fact that cURL shows the same behaviour, suggests you are hitting a timeout. It could be a firewall that doesn't let you perform requests like that.
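
Two quick checks along those lines, only a sketch with example timeout values: whether `allow_url_fopen` is enabled at all, and short cURL timeouts so a blocked request fails fast instead of hanging until the server cancels it.

<?php
// Is remote URL access allowed for file_get_contents()?
var_dump(ini_get('allow_url_fopen'));

// Fail fast instead of hanging for 15-20 seconds.
$ch = curl_init("http://www.example.com");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); // seconds to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 10);       // seconds for the whole request
$contents = curl_exec($ch);
if ($contents === false) {
    echo curl_error($ch); // a timeout message here points at a firewall or host restriction
}
curl_close($ch);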

GolezTrol
  • It's running on a free-host server, could that be the cause? – Titan9251 Oct 24 '15 at 21:14
  • Could be, but you should try `curl_error()` (when using the cURL version) to get extra error information. `file_get_contents` won't show you that information. – GolezTrol Oct 25 '15 at 05:08

You mentioned that your script is running on a free hosting server.

Most free hosting servers have the file_get_contents() function disabled. Check the Help section of your hosting provider; I'm fairly confident it will be mentioned there that the function is disabled.
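
If you want to confirm that from the script itself, here is a minimal sketch using only standard PHP ini settings:

<?php
// Which functions has the host disabled, and is remote URL access allowed?
echo 'disable_functions: ' . ini_get('disable_functions') . "\n";
echo 'allow_url_fopen: ' . var_export(ini_get('allow_url_fopen'), true) . "\n";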

dchayka