
I'm having trouble getting my PHP page to run only when it is requested by the server itself.

This is what I have right now:

if ($_SERVER['SERVER_ADDR'] == $_SERVER['REMOTE_ADDR']) {
        //process page
} else {
        $this->redirect('http://' . $_SERVER['HTTP_HOST'] . '/404');
}

However, when I curl it, it doesn't give any errors or return anything at all. If I remove the check, it spits out the HTML as expected.

I tried echoing both of those values and got 192.168.1.186 and 192.168.1.225, respectively. I realize they are different (even though this is being run by the server itself), but how can I fix it? The code above came from this S.O. answer.
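
For reference, this is roughly the debug snippet I used to see those two values (stripped down; the addresses in the comments are just what I got on my setup):

echo $_SERVER['SERVER_ADDR'];  // 192.168.1.186 here
echo ' / ';
echo $_SERVER['REMOTE_ADDR'];  // 192.168.1.225 when I curl the page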

theintellects
  • A few options, but I would lock it down with .htaccess if you can't put it outside the web root (always option 1). –  May 21 '13 at 21:53
  • How are you accessing the script? That might tell us why you are getting different IP addresses. Also, is there a reason you aren't using CLI rather than trying to access it via a local browser? – Devon Bessemer May 21 '13 at 21:52

2 Answers


The title of your question lets me offer an answer that doesn't quite match its body.

My response is that wrapping your entire script in one giant if statement seems both insecure and hard to maintain.

You would need a guard like this if it were possible for other computers to run your script, say by accessing it from the web.

But if the server is the only machine that should run the script, why not simply put it somewhere only the server can access it? For instance, one directory level above the web-accessible directory, or in a child directory with 700 permissions. Or use .htaccess to limit access.

That seems both safer and more maintainable.
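
As a sketch of the "put it above the web root" option (the path and file name here are made up; adjust to your layout), the script would live in a directory that no URL maps to, so only code already running on the server can execute it:

<?php
// /var/www/private/run_task.php -- hypothetical location, outside the public web root.
// No URL maps to this file, so a browser or curl from another machine cannot reach it.
// The server itself can still run it, for example from cron:
//     php /var/www/private/run_task.php
// or by requiring it from trusted server-side code:
//     require '/var/www/private/run_task.php';

// ... process page (the work that currently sits inside the big if statement) ...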

Richard
  • I have attempted using .htaccess to limit it, but the script is dependent on a few controllers in the framework (CakePHP), so those controllers are only instantiated when I hit the URL and go through the routing. My attempts to limit the URL to localhost in htaccess failed. – theintellects May 21 '13 at 22:00

It is easier and better to use your server configuration to limit file access. You could, for instance, use a .htaccess file in the specific directory with these contents:

Order deny,allow
Deny from all
Allow from 127.0.0.1

This denies all traffic except from 127.0.0.1 (localhost); Apache will return a 403 error when someone tries to access these files from another computer.
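
Note that Order/Deny/Allow is the Apache 2.2 syntax. On Apache 2.4 or later the same restriction would be written with the newer Require directives instead (a sketch, assuming the stock mod_authz_host module is enabled):

# Apache 2.4+ equivalent of the Deny/Allow block above
Require local

Require local permits requests from the loopback addresses and from connections where the client and server address are the same, which matches the "requested by the server itself" case.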

Qurben