The small website I am managing for my club is currently seeing an increasing number of attack attempts. So far those attacks have not been very sophisticated, so nothing has happened (yet). Now I just want to make sure that the way we request content is as well protected as possible and not a huge hole in our defence. For that I think I need to make additional changes to the code, but I don't really know where to start.
The setup:
We load our content pages by using $_GET arguments to tell which page to load; those arguments are then passed into a database query to fetch more data for that specific page. This is all done using a rewrite rule, so instead of index.php?arg1=this&arg2=that
as our URL we can use /pages/arg1/arg2
The code:
The rewrite rule looks as follows:
RewriteRule ^pages/([^/]*)/([^/]*)$ /index.php?arg1=$1&arg2=$2 [B,NE,L]
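In index.php the two values then simply arrive as $_GET parameters. Simplified (the real file has a bit more going on around it), that part looks roughly like this:
// read the rewritten URL parts, falling back to an empty string if one is missing
$arg1 = isset($_GET['arg1']) ? $_GET['arg1'] : '';
$arg2 = isset($_GET['arg2']) ? $_GET['arg2'] : '';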
The code to then request the data from the database:
$statement = $connection->prepare("SELECT * FROM `navigation` WHERE `Name` = :arg2");
$statement->bindParam(":arg2", $arg2);
$statement->execute();
$result = $statement->fetch();
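For completeness, $connection is an ordinary PDO connection; simplified and with placeholder credentials it is created roughly like this:
// plain PDO connection to the site database (placeholder credentials),
// with exceptions enabled so failed queries don't silently return nothing
$connection = new PDO(
    'mysql:host=localhost;dbname=clubsite;charset=utf8mb4',
    'dbuser',
    'dbpassword',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);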
Now what I think I need to do is add additional checks for the arg2 string before I hand it to the database query, since I'm not sure whether the binding alone actually prevents injections. But I don't know exactly what to check for, as the string itself can contain special characters like
à l' ù
and likewise symbols like
; . _ ( ) -
so I can't just check whether it is only letters. I thought about using a regex to check that the string contains nothing but allowed characters, like so:
// only allow letters, digits, whitespace and the few special characters that appear in page names
// (the u modifier so the accented characters are handled as UTF-8)
if (!preg_match("/^[0-9a-z\s;._à'ù()-]+$/iu", $arg2)) {
    echo "Non valid string";
    exit;
}
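Put together, the idea would be to run that check right before the query, roughly like this (just a sketch of how I would wire it up, not final code):
// validate arg2 first, and only hand it to the prepared statement if it passes
if (!preg_match("/^[0-9a-z\s;._à'ù()-]+$/iu", $arg2)) {
    echo "Non valid string";
    exit;
}
$statement = $connection->prepare("SELECT * FROM `navigation` WHERE `Name` = :arg2");
$statement->bindParam(":arg2", $arg2);
$statement->execute();
$result = $statement->fetch();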
But I am not sure whether this is enough to prevent code injection into either the PHP request or the database, or at worst even both. Do you guys have any tips or ideas on how to lock it down even more, so that only valid entries (i.e. non-code text) actually result in a query to the system? Or have I already done that and is there, for now, nothing more I can do?