
I'm building a web server out of a spare computer in my house (with Ubuntu Server 11.04), with the goal of using it as a file sharing drive that can also be accessed over the internet. Obviously, I don't want just anyone being able to download some of these files, especially since some would be in the 250-750MB range (video files, archives, etc.). So I'd be implementing a user login system with PHP and MySQL.

I've done some research on here and other sites and I understand that a good method would be to store these files outside the public directory (e.g. /var/private vs. /var/www). Then, when the file is requested by a logged in user, the appropriate headers are given (likely application/octet-stream for automatic downloading), the buffer flushed, and the file is loaded via readfile.
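A minimal sketch of that approach, assuming files live in /var/private and an `is_logged_in()` function representing your PHP/MySQL login check (both are placeholders, not part of any real library):

```php
<?php
// Sketch of the readfile() approach. is_logged_in() and the
// /var/private path are placeholders for your own auth and layout.
session_start();

if (!is_logged_in()) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}

// Never trust a raw filename from the client; basename() strips
// any directory components like "../".
$file = basename($_GET['file']);
$path = '/var/private/' . $file;

if (!is_file($path)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file . '"');
header('Content-Length: ' . filesize($path));

// Discard any output buffers so large files stream to the client
// instead of being held in memory.
while (ob_get_level()) {
    ob_end_clean();
}
readfile($path);
exit;
```

readfile() streams the file in chunks rather than loading it all at once, so memory use stays flat even for multi-hundred-megabyte files, as long as output buffering is off.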

However, while I imagine this would be a piece of cake for smaller files like documents, images, and music files, would this be feasible for the larger files I mentioned?

If there's an alternate method I missed, I'm all ears. I tried setting a folder's permissions to 750 and similar, but I could still view the file through normal HTTP in my browser, as if I were considered part of the group (and when I set the permissions so I can't access the file, neither can PHP).

Crap, while I'm at it, any tips for allowing people to upload large files via PHP? Or would that have to be done via FTP?

Doug Wollison

3 Answers


You want the X-Sendfile header. It will instruct your web server to serve up a specific file from your file system.

Read about it here: Using X-Sendfile with Apache/PHP
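A rough sketch of how that looks in PHP, assuming Apache with mod_xsendfile installed and configured (the module, paths, and filename here are illustrative):

```php
<?php
// Requires mod_xsendfile enabled in Apache, with something like:
//   XSendFile on
//   XSendFilePath /var/private
// in the server config. The file path below is a placeholder.
header('X-Sendfile: /var/private/video.mp4');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="video.mp4"');
// Note: no readfile() call. Apache intercepts the X-Sendfile
// header and streams the file itself, so PHP exits immediately
// and large files never pass through the PHP process.
```

Your PHP script still performs the login check first; it just delegates the actual byte-pushing to the web server.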

Joe
  • There is more information about this approach here: http://www.php.net/manual/en/function.readfile.php#103837 – 0x6A75616E Aug 20 '11 at 15:31
  • That should be perfect, it seems more direct than setting headers and reading the file, correct? – Doug Wollison Aug 20 '11 at 16:40
  • Far more direct. A webserver is designed and optimised to serve up files as efficiently as possible, so you should let it do its job! Writing your own file server is just duplicating code that already exists. Also, it can handle things you might not think of, such as using the appropriate compression for the client, getting the Content-Type right, etc. – Joe Aug 20 '11 at 19:01

That could indeed become an issue with large files.

Isn't it possible to just use FTP for this?

HTTP isn't really meant for large files, but FTP is.

PeeHaa
  • True, and personally I will be, but some of the users I have in mind might not be the most tech savvy with FTP. Wait, I can upload a file in PHP via FTP as opposed to the usual HTTP, correct? – Doug Wollison Aug 20 '11 at 16:38

The solution you mentioned is the best possible one when the account system is handled via PHP and MySQL. If you want to keep it away from PHP and let the server do the job, you can password-protect the directory via an .htaccess file. That way the files won't go through PHP at all, but honestly there's nothing you should be worried about. I recommend you go with your method.
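For reference, the .htaccess approach is a short Apache config fragment; the paths and username here are placeholders, and the password file is created separately with the `htpasswd` tool:

```apache
# .htaccess inside the protected directory (paths are placeholders).
# Create the password file first, e.g.:
#   htpasswd -c /etc/apache2/.htpasswd someuser
AuthType Basic
AuthName "Restricted Files"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user
```

Note this uses Apache's own HTTP Basic auth accounts rather than your MySQL user table, which is why it doesn't mix well with a PHP login system.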

Sebastian Nowak