
I have recently been tasked with opening a sub-domain of a small-scale website that allows trusted users to upload and store files as backups away from their local disks. In general, this new sub-domain will be used mainly to store static Office-type documents (e.g. Microsoft/OpenOffice documents, PDFs, plain text files, etc.).

I want to ensure that there is no chance (or realistically, as little chance as possible) of a naïve user inadvertently uploading a potentially dangerous file, e.g. some nasty PHP script. Ideally, I would like to close down PHP and anything else that could be a security risk (CGI processes, Server-Side Includes, etc.).
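To give an idea of the sort of thing I am imagining (completely untested, and I am not sure which of these directives my host's AllowOverride setting would actually permit), perhaps an .htaccess in the upload directory along these lines:

# Disable CGI execution and Server-Side Includes for this directory tree
Options -ExecCGI -Includes

# Stop Apache treating PHP files as executable (mod_mime)
RemoveHandler .php .phtml .php3 .php4 .php5
RemoveType .php .phtml .php3 .php4 .php5

# Belt and braces: refuse to serve PHP files at all (Apache 2.2 syntax)
<FilesMatch "\.ph(p[345]?|tml)$">
    Order Allow,Deny
    Deny from all
</FilesMatch>

# Under mod_php only (this line causes a 500 error under CGI/FastCGI PHP)
php_flag engine off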

I was just wondering if the Stack Overflow community could help to answer the following question: what is the best way of shutting down all file types/processes that deal with dynamic/executable code, so that in effect the sub-domain is nothing more than a basic static file server?

I have had a look on Google using various keywords/phrases, but I cannot seem to find a good reference on making a sub-domain “safe” to the extent that is possible with a shared-hosting user's level of authority.

The website runs on Apache 2.2 on a typical LAMP architecture, and is hosted on a third party shared server.

I DO have access to:

  • .htaccess (directory level with typical privileges/restrictions)
  • php.ini and .user.ini (directory level with typical privileges/restrictions)
  • control panel software with some fairly generous options and features (cPanel X)
  • fairly flexible web hosts with an excellent tech support service

I DO NOT have access to:

  • root access (obviously!)
  • httpd.conf
  • php.ini (application server level)
  • mysql.cnf

Please bear in mind that I do not have the resources to just invest in a file server or outsource it to a third-party service. Also, this server is not going to be used in a CDN sense, so performance is not really an issue.

(Also, I don't know what can be done about client-side scripts, e.g. JavaScript/VBScript, but any suggestions would be welcome.)
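(One idea I did come across while searching, again untested and dependent on the host loading mod_headers, is to force uploaded files to download rather than render in the browser, and to stop MIME sniffing:

<IfModule mod_headers.c>
    Header set X-Content-Type-Options "nosniff"
    Header set Content-Disposition "attachment"
</IfModule>

I have no idea whether this is considered sufficient, though.)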

Thanks in advance!

Jordan Clark

1 Answer


Simple. Don't provide direct access to the files. Run EVERYTHING through a PHP script, which serves up the content as application/octet-stream, e.g.:

<?php

// Look the file up by its internal ID, never by a user-supplied path
$id   = (int) $_GET['id'];
$data = get_file_details_from_database($id);

if ($data && user_is_allowed_to_access($id)) {
    header('Content-Type: application/octet-stream');
    readfile($data['path_to_file_on_server']);
} else {
    header('HTTP/1.0 403 Forbidden');
}

With this it doesn't matter AT ALL what kind of file they upload - it will never be reachable via a direct http://example.com/nastyfile.php-type URL. And if you store the files on the drive using only their internal ID number, and not their user-provided filename, you gain even more security: a webserver might try to execute nastyscript.php, but if it's just 12345 on the drive, the server won't know what to do with it.
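For what it's worth, a minimal sketch of the matching upload side, assuming a hypothetical `files` table with an AUTO_INCREMENT id column (the table, column, and path names here are made up for illustration):

<?php

// Hypothetical upload handler: store the file under its database ID,
// never under the user-supplied name, so there is nothing executable to hit.
$pdo = new PDO('mysql:host=localhost;dbname=backups', 'dbuser', 'dbpass');

// Record the metadata; AUTO_INCREMENT hands back the internal ID
$stmt = $pdo->prepare('INSERT INTO files (original_name, uploaded_at) VALUES (?, NOW())');
$stmt->execute([basename($_FILES['upload']['name'])]);
$id = $pdo->lastInsertId();

// Store the bytes as a bare number with no extension, e.g. uploads/12345,
// ideally in a directory outside the web root
move_uploaded_file($_FILES['upload']['tmp_name'], '/path/to/uploads/' . $id);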

Marc B
  • Thanks, Marc. This is a very useful suggestion, and I will be certain to keep the PHP snippet above in my "toolbox", as it were. However, as I mentioned, this website is effectively running on a hobbyist's budget, and I would not have the resources (and admittedly, the technical ability!) to store files by their internal ID number. To be more specific, are there any directives that you could enter at a shared server user's level of authority, say in my root `.htaccess` or local `php.ini` file(s)? – Jordan Clark Jul 31 '13 at 14:47
  • what resources do you need? a simple database to record the file's metadata (who uploaded, original filename, auto_increment ID number) and you're done... – Marc B Jul 31 '13 at 14:49