
I have a script that uses the following:

header('Refresh: 5; url=http..');
die();

And I call this script from another PHP file that uses the function `file_get_contents`. Unfortunately it does not work. With a `header('Location: ...')` redirect there are no problems.

Any suggestions?

-- UPDATES --

I have followed Oscargeek's advice and updated the code to print HTML that contains a meta refresh. The script that calls this URL is a cron "system", and it makes the call in a foreach, so I think that is why it can't work. I have replaced this call with a cron job and wget, but the result is the same.

Any other suggestions?

Max

2 Answers


When you do a file_get_contents, you get the HTML of the first page, but not its headers.

file_get_contents only returns a string, without the response headers. A `header('Location: ...')` works because the HTTP wrapper follows that redirection before the string is returned.

Try doing the redirect from the HTML instead; have your first page output this content:

<html>
    <head>
        <meta http-equiv="refresh" content="5; url=http://google.com" />
    </head>
</html>

In the PHP script you are calling, print only this content, with no other data, and the refresh will happen.
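As a minimal sketch of that script (the target URL and delay below are placeholders, not values from the question), the whole body can be printed from PHP:

```php
<?php
// Minimal sketch of the redirecting script; $url and $delay are
// placeholder values. Because the redirect lives in the HTML body,
// it survives file_get_contents, which discards HTTP response headers.
$url   = 'http://example.com/target';
$delay = 5;

echo '<html><head>'
   . '<meta http-equiv="refresh" content="'
   . $delay . '; url=' . htmlspecialchars($url) . '" />'
   . '</head></html>';
```

Anything fetching this page (browser, file_get_contents, wget) receives the meta tag as part of the string, although only a browser will actually act on it.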

Nomad Webcode
  • Hi, thanks for the answer; I made some tests. The calling PHP does the file_get_contents in a "foreach", so even if I print the HTML, it doesn't redirect. I use PHP for the call because I have an internal cron "system", but I can also call the final URL with "wget". I tested it with "wget" and it doesn't work. If I visit the final URL in a browser, it works without problems – Max Jul 16 '15 at 09:26
  • Sorry, but I don't understand what you need. If you explain, I'll give a better answer, but what you are trying is not normal practice. You have a foreach in which you get the HTML of a page that redirects to another page; the question is, why don't you get the HTML of the second page directly? – Nomad Webcode Jul 16 '15 at 10:42
  • Sorry for my English. I have a cron that calls the first script, and the first script calls (with file_get_contents) the second script, which has the redirect with refresh. I made several tests, but I think the curl I configured in the cron (Unix shell) can't follow a refresh redirect. – Max Jul 16 '15 at 10:57
  • OK, but my question is: why aren't you doing the curl to the final URL directly, instead of to a previous URL that does a redirection? – Nomad Webcode Jul 16 '15 at 13:26
  • Because we have an internal plugin that manages cron. The max_redirect limit of file_get_contents is also part of the problem. For now we have used a trick, so the problem is solved for the moment. Thanks for the advice – Max Jul 17 '15 at 09:36
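For the cron side of the discussion above, one workaround can be sketched (all URLs here are placeholders): neither `curl -L`, `wget`, nor file_get_contents follows a meta/Refresh redirect (they only follow Location headers), so the cron script can extract the refresh target from the HTML itself and fetch it directly.

```shell
# Hypothetical cron-side workaround: pull the refresh target out of the
# fetched HTML, then fetch that URL directly.
# html=$(curl -s 'http://example.com/first.php')   # real fetch (placeholder URL)
html='<html><head><meta http-equiv="refresh" content="5; url=http://example.com/final" /></head></html>'

# Extract the url=... part of the meta tag's content attribute.
target=$(printf '%s' "$html" | sed -n 's/.*url=\([^"]*\)".*/\1/p')
echo "$target"
# curl -s "$target"   # then fetch the final page directly
```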

Okay, first of all, I'm wondering why you use file_get_contents to include a PHP file. I'd use include or require.

Some additional information about your problem:

The problem is: none of those headers ever takes effect in your other script. IF they are sent at all, they are sent by the file you are trying to read; but since that script's output never reaches the browser via the HTTP protocol, the header does nothing.

If you want to use it like this, I'd advise an HTML refresh as Oscargeek suggested, or use include/require if you want to keep it as PHP code.
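The distinction can be sketched like this (the file path and header value are hypothetical): file_get_contents on a local path just reads the raw source without executing it, while include runs the code inside the current request, where a header() call can still take effect.

```php
<?php
// Hypothetical demo file: a script that tries to send a Refresh header.
file_put_contents('/tmp/redirect_demo.php',
    "<?php header('Refresh: 5; url=http://example.com'); echo 'body';");

// file_get_contents on a local path returns the raw source, unexecuted;
// no header() runs and no header is ever sent:
echo file_get_contents('/tmp/redirect_demo.php'), "\n";

// include, by contrast, would execute the script inside this request,
// so its header() call applies to the response being built:
//   include '/tmp/redirect_demo.php';
```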

SophieXLove64
  • Maybe the first script is hosted on another server; in that case it's necessary to do a file_get_contents to get the HTML. It's rare but plausible. If the script is on the same server, the best practice is of course to do an include. – Nomad Webcode Jul 16 '15 at 08:43
  • Yup. I may be wrong, but isn't cross-domain file reading disabled by default? I guess it was. [link](http://stackoverflow.com/questions/19024349/php-file-get-contents-does-not-work-from-server) – SophieXLove64 Jul 16 '15 at 08:49
  • Using file_get_contents there are no restrictions on capturing other domains. You can do a `print file_get_contents('http://google.com')`, for example – Nomad Webcode Jul 16 '15 at 08:59
  • Only because Google allowed it, since they needed it themselves for some services. I checked it on all of our web servers: reading files from outside our network is disallowed. ;) – SophieXLove64 Jul 16 '15 at 09:05
  • Please try this: http://downloads.oscargeek.com/test/sofsample/ If you can browse to a page, you can capture it with file_get_contents; the server can't do anything to restrict this. – Nomad Webcode Jul 16 '15 at 09:11
  • I guess Stack Overflow allows this to happen. My servers just give you a 403 message when you try this, so the function simply returns false. – SophieXLove64 Jul 16 '15 at 09:14
  • You can name any server that you think is restricted and I'll do the same :) Anyway, if the OP has to do a file_get_contents, it will be because there is no restriction – Nomad Webcode Jul 16 '15 at 09:15
  • We can never prove whether it's right or wrong. I can only say what I see on OUR servers. Maybe we blocked it; I don't know, it was a long time ago that I installed PHP on them. You are nice, Oscar, nice to meet you :) – SophieXLove64 Jul 16 '15 at 09:17