
I have some PHP scripts on my server which I use for periodic cron jobs (for example, generating daily reports and updating leaderboards).

To prevent outsiders from running these scripts manually (for example, by opening http://url/script.php in a browser) I included the following code to check the IP address before running the actual script, where XX.XX.XX.XX represents the IP address of my own network.

    // client address as seen by the web server
    $remote = isset($_SERVER["REMOTE_ADDR"]) ? $_SERVER["REMOTE_ADDR"] : '127.0.0.1';
    // addresses allowed to run the script; XX.XX.XX.XX is my own network
    $whitelist = array('XX.XX.XX.XX', '127.0.0.1');

    // silently stop for everyone else
    if (!in_array($remote, $whitelist))
    {
        exit;
    }

So now I have the following questions:

  • How safe is this?
  • What are the risks?
  • How can I make this more safe?
  • Are there other (better) solutions?

PS. My previous question was closed because someone thought it was a duplicate of PHP IP Address Whitelist with Wildcards. But this is not the case! That question is about using wildcards in whitelisting, while this question is about the safety and risks of this solution.

The Stompiest
  • Why do you put php files in public space at all? All the more so files made by cron. In the event of a server crash they can be displayed as txt. Your security is weak if the IP is shared. – Slawomir Dziuba May 08 '20 at 14:49
  • To be honest, I didn't even know this was an option. – The Stompiest May 08 '20 at 15:00
  • @SlawomirDziuba is this also possible when I use a VPS? – The Stompiest May 08 '20 at 17:15
  • Both problems are not related to the server type. PHP works as a preprocessor, which means that if the www server gateway doesn't work, the php page is sent as plain text (mime-type text/html), for example when you have lost your MySQL password. In public_html there should be only a symbolic link to the main file of the application (or, if you can't place symlinks, a script that includes the application files from outside public_html) plus static files. Then, in the event of a failure, nothing, or nothing relevant, is disclosed. – Slawomir Dziuba May 08 '20 at 17:44
  • So would it be enough to place the PHP scripts that I run for the cron jobs outside of the httpdocs folder? And how about other resources like classes and interfaces? – The Stompiest May 08 '20 at 22:12
  • Yes. Cron scripts run under php-cli, so there is no reason for them to have public access. All active files should be outside the public directory, especially .inc files with passwords etc. – Slawomir Dziuba May 09 '20 at 06:10
  • Do you think the IP check will then be redundant for cron job scripts? Then I would only do the IP checks on pages (like leaderboards) that I want viewed only by people within my network. Is this the way I should design it? – The Stompiest May 09 '20 at 08:59
  • Since the scripts cannot be accessed via www, why check IP addresses in them? It is difficult to talk about the application design without knowing all the details. I'd rather use iptables, .htaccess or htpasswd to manage access to sensitive data. – Slawomir Dziuba May 09 '20 at 09:44
  • Thank you so much for giving me these insights! They were very helpful! If you can summarize them in an answer I would be happy to accept it! – The Stompiest May 09 '20 at 11:51
  • Thanks. I wrote the answer. – Slawomir Dziuba May 09 '20 at 13:21

2 Answers


The presented method is not completely secure.

PHP acts as a text preprocessor, which means that in the event of a web server gateway error the script's source can be sent to the client with mime-type text/html. This risks revealing sensitive data such as SQL database or (S)FTP passwords.

Administrative scripts placed in the public directory also carry the risk of unauthorized execution if the IP address checked in the script is a shared (or dynamically assigned) address. Cron scripts are executed using php-cli, so the web server gateway is not needed at all, and once the script lives outside the public directory the IP analysis in it becomes unnecessary.
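
For illustration, a crontab entry along these lines runs such a script directly through php-cli, so no HTTP request is ever involved (the interpreter and script paths are placeholders, not taken from the question):

# run the daily report at 03:00 via php-cli, entirely outside the web server
0 3 * * * /usr/bin/php /home/username/domainname/crons/daily_report.php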

Remote execution (e.g. using curl) could be the only reason for placing administrative scripts in the public space of the www server. This is usually a weak solution, because the script is then run by the web server's PHP interpreter (not php-cli) with different settings, usually with a drastically limited execution time. If it is necessary for some reason, the script should sit in a separate directory to which access is limited to specific IP addresses using .htaccess (and/or iptables) and protected with a username and password using htpasswd (Basic Auth).
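
If you do go that route, a minimal sketch of such an .htaccess file (Apache 2.4 syntax; the realm name, file path and address are placeholders):

# require BOTH the whitelisted address and a valid htpasswd login
AuthType Basic
AuthName "Admin scripts"
AuthUserFile /home/username/.htpasswd
<RequireAll>
    Require ip XX.XX.XX.XX
    Require valid-user
</RequireAll>

The matching password file can be created once with htpasswd -c /home/username/.htpasswd adminuser (the username again being a placeholder).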

The ideal situation is when the public directory of the www server (hereinafter public) contains only static content (img, css, js, ... files) plus an entry point for the application, which itself lives in the parent directory. An example structure:

/home/username/domainname/(apps,crons,public,tmp)

The apps directory should contain all application files and directories. The public directory should contain only static content (ordered into subdirectories if you like) and a symbolic link to the main file of the application, which can be created with the command:

ln -s ../apps/app.php index.php

Some server configurations do not allow the use of symlinks. In that case you can use an index.php file containing:

<?php
// no symlink available: pull the application in from outside the public directory
include('/home/username/domainname/apps/app.php');

This solution is a bit worse because, in the event of a gateway failure, the directory structure is revealed. Sensitive data is still secure, however, because the web server cannot display the contents of files that are not there.

The presented IP analysis can be used to display parts of the content only to authorized addresses, assuming the PHP file itself sits outside the web server's public directory. For entire websites, however, I would rather use iptables or .htaccess to manage access.
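
For a whole site that should be reachable only from one network, a sketch of such iptables rules (assuming plain HTTP on port 80; adapt for HTTPS) filters traffic before PHP is ever involved:

# accept HTTP from the whitelisted address, drop everything else
iptables -A INPUT -p tcp --dport 80 -s XX.XX.XX.XX -j ACCEPT
iptables -A INPUT -p tcp --dport 80 -j DROP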

Slawomir Dziuba
  • Thanks for the clear answer! You not only gave me clear insight into where to store resources, but also provided a better solution for the part of the application that I want to make accessible to certain users (by using .htaccess and/or htpasswd). – The Stompiest May 09 '20 at 13:49
  • Nice to hear. Good luck with your implementation. – Slawomir Dziuba May 09 '20 at 13:56

How safe is this?

Realistically, it's pretty safe as long as you're in control of the address (127.0.0.1 is okay, XXX.XXX.XXX.XXX might not be). By pretty safe I mean that anyone in a position to abuse this check would almost certainly have a far easier time abusing the rest of the web application.

What are the risks?

Someone might call your script from outside if they had a way of assuming the IP address XXX.XXX.XXX.XXX, or of tricking the system into believing they had.
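
One detail worth spelling out (my note, not part of the original check): keep the comparison on REMOTE_ADDR, which the server derives from the TCP connection, and never on client-supplied headers:

// REMOTE_ADDR is filled in by the web server from the TCP connection
// and is hard to forge without controlling the network path.
$remote = isset($_SERVER['REMOTE_ADDR']) ? $_SERVER['REMOTE_ADDR'] : '127.0.0.1';
// Headers like X-Forwarded-For are set by the client and trivially spoofed;
// never whitelist against them unless a trusted reverse proxy sets them:
// $remote = $_SERVER['HTTP_X_FORWARDED_FOR'];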

How can I make this more safe?

You can include a secret in the original call, and check it against a hash of the same secret. The secret is not revealed even if someone can read the script.

// reject calls that do not carry the secret at all
if (!array_key_exists('key', $_GET)) {
    die('Access denied');
}
// compare a SHA-1 of the supplied secret with the stored hash;
// the plaintext secret never appears in this file
if (sha1($_GET['key']) !== '713dca7cf928f23a2347cae828d98879629e1e80') {
    die('Access denied');
}
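
For reference, the stored hash can be generated once from a shell, assuming the PHP CLI is available (the secret shown is a placeholder):

php -r "echo sha1('your-secret-here'), PHP_EOL;"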

You can also place the script outside the web root and call it through a require statement. This way, either the PHP subsystem works and the script cannot be read, or it does not work and all that's revealed is the name of an inaccessible directory. You can even merge the two approaches:

// only a value whose SHA-1 matches the stored hash is accepted...
if (sha1($_GET['key']) !== '713dca7cf928f23a2347cae828d98879629e1e80') {
    die('Access denied');
}
// ...and the validated value is itself the path of the real script
$realScript = $_GET['key'];
require $realScript;

Now, the only script that can be included is the one whose name has that specific SHA1 hash, and no other (the risk of collisions is realistically negligible: you would need a collision with a valid filename, and the means of creating such a filename). So you know that the script is valid, but unless the name is supplied in the call, the whole construction will not work, and it will not even tell an attacker why.

curl 'http://yoursite.internal.address/cron/cron.php?key=../scripts7ab9ceef/mycron.php'

Are there other (better) solutions?

Yes and no. Calling the script using the command-line interface is just as safe, and does not need a working web server. It also allows running as a different user if needed.
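
If you take the CLI route, a one-line guard at the top of the script (a sketch of mine, not part of the original suggestion) refuses to run under the web server even if the file is ever exposed:

// abort unless running under the CLI SAPI (i.e. started by cron or a shell)
if (PHP_SAPI !== 'cli') {
    exit('CLI only');
}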

On the other hand, it requires a command-line interface to be installed, which may create other security issues. And even if the same user and home directory are used, the two interfaces might still behave subtly differently (or not so subtly: you might have a PHP 7.3 web module and a PHP 5.2 CLI installation, or vice versa, which would make a script with short array syntax, or with constructs like if (empty(some_function($_GET['x']))), not even load in one interface or the other).

All in all, the crontab call to curl or lynx is probably more maintainable and simpler to use, even if it is undoubtedly less efficient.

LSerni