
I have some questions about how folder and file permissions work. Say I have user directories outside 'protected', as below:

users
  -- usera
    -- docs
  -- userb
    -- docs
protected

I do not want user B, who does not have the rights, to access anything in user A's directories. I also do not want anyone to access the directories directly via URL links. Basically, I just want users to be able to access their own directories and nothing else. How can it be done?

thanks!

twb
    This needs more context. Access how? What is the relevance of "`protected`"? – deceze Nov 25 '11 at 02:48
  • access as in read or download the files. protected is a directory that has .htaccess 'deny from all' for hosting php scripts. – twb Nov 25 '11 at 03:25
  • This is still somewhat confusing since you talk about "permissions", which is usually a file system thing. You seem to be talking about more abstract permissions when accessing files through the web server though, or are you? – deceze Nov 25 '11 at 03:46
  • sorry for the confusion. i am referring to permission as in authentication. authenticated user should only be able to access his files and folders only. likewise, any web users should not be able to access any users files/folders via urls. – twb Nov 25 '11 at 03:54

2 Answers


Without more info to go on, my suggestion would be to make sure the user directories are above the web root. This prevents them from being linked to directly. Then create a PHP script that validates that a user is who they say they are. Once you know the identity of the logged-in user, you can use fpassthru() (http://php.net/fpassthru) or similar to deliver the docs to the user.
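That approach can be sketched like this (a minimal sketch, assuming a session-based login that stores the user's id in `$_SESSION['user_id']`, and a `users/` tree above the web root; all paths and function names here are illustrative, not part of the original answer):

```php
<?php
// Sketch: serve a file from the logged-in user's own directory only.
// Assumes login code has already put the user's id in $_SESSION['user_id'].

// Build the on-disk path; basename() strips any "../" components,
// so the request cannot escape the user's docs folder.
function user_doc_path(int $userId, string $requested): string
{
    $name = basename(trim($requested));
    return '/home/site/users/user' . $userId . '/docs/' . $name;
}

function deliver(string $path): void
{
    if (!is_file($path)) {
        http_response_code(404);
        exit('File not found');
    }
    header('Content-Type: application/octet-stream');
    header('Content-Length: ' . filesize($path));
    $fp = fopen($path, 'rb');
    fpassthru($fp); // streams the open file straight to the client
    fclose($fp);
    exit;
}

// download.php entry point would look roughly like:
// session_start();
// if (!isset($_SESSION['user_id'])) { http_response_code(403); exit; }
// deliver(user_doc_path($_SESSION['user_id'], $_GET['file']));
```

Because the files live above the web root, a direct URL can never reach them; the only way in is through this authenticated script.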

Daniel
  • hi, what other info will be needed? is there other ways because i will think that using passthr will put alot of processing burden on the server? – twb Nov 25 '11 at 03:02
  • There are some interesting comments on the fpassthru PHP page relating to speed and server resource usage. Using fread is probably actually faster. You should scan through the comments on that page for some good tips. If you don't want to use a PHP function, you can look at something like mod-auth-token http://code.google.com/p/mod-auth-token/ – Daniel Nov 25 '11 at 03:18
  • mod-auth-token looks good! but i cant get this on shared hosting? – twb Nov 25 '11 at 03:24
  • On shared hosting your best bet is probably going to be fread or fpassthru. You _might_ be able to get something working where you generate a random token yourself, then write to an htaccess file in the user's directory to allow that token access to the files. But that seems a lot of hassle for not much benefit over fread. You might consider setting up a load test using Charles http://www.charlesproxy.com/. Fiddler http://www.fiddler2.com/ might be able to do it as well with an extension. – Daniel Nov 25 '11 at 03:59

I answered a similar question here, limiting users to subdirectories, which you should be able to adjust to suit your needs. I've copied it here as well.

Download.php

<?php
/** Load your user; assumed available as $user **/

/** Sanitize the file name here -- basename() strips any path components **/
$file = basename(trim($_GET['file']));

$path = '/users/user'.$user->id.'/'.$file;

if (true === file_exists($path)) {
   //from http://php.net/manual/en/function.readfile.php
   header('Content-Description: File Transfer');
   header('Content-Type: application/octet-stream');
   header('Content-Disposition: attachment; filename="'.$file.'"');
   header('Content-Transfer-Encoding: binary');
   header('Expires: 0');
   header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
   header('Pragma: public');
   header('Content-Length: ' . filesize($path));
   ob_clean();
   flush();
   readfile($path);
   exit;
} else {
   throw new Exception('File Not Found');
}

.htaccess (to deny all direct file downloads)

deny from all

You would then link to files using /download.php?file=filename.ext, and it would download that file only from the current user's directory.

You'll want to ensure you sanitize the input file name so you're not vulnerable to directory traversal exploits.
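One robust way to do that check (a sketch; the helper name is illustrative) is to resolve the requested path with realpath() and confirm it is still inside the user's base directory:

```php
<?php
// Sketch: return the resolved path only if it stays inside $base;
// return null for traversal attempts or missing files.
function safe_path(string $base, string $name): ?string
{
    $real = realpath($base . '/' . $name);
    $baseReal = realpath($base);
    // realpath() returns false if the path does not exist
    if ($real === false || $baseReal === false) {
        return null;
    }
    // The resolved path must begin with the base directory
    if (strncmp($real, $baseReal . DIRECTORY_SEPARATOR, strlen($baseReal) + 1) !== 0) {
        return null;
    }
    return $real;
}
```

Unlike a plain string check on the raw input, this catches encoded and symlinked variations, because the comparison happens after the filesystem has resolved the path.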

Jason Brumwell
  • hi, will this put extra burdens on the server? such that 100 users are downloading different 1mb files (100x100mb) at a time will cause the memory or processing power to cripple?? – twb Nov 25 '11 at 03:10
  • yup..i think i will use this approach, together with xsendfile. cheers – twb Nov 25 '11 at 06:33
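Following up on that last comment: with Apache's mod_xsendfile, the PHP side reduces to emitting a header and letting the web server do the transfer, which keeps PHP's memory use flat under concurrent downloads. A sketch, assuming the module is installed and XSendFilePath is configured to allow the users directory; the helper name is illustrative:

```php
<?php
// Sketch: headers that hand the transfer off to mod_xsendfile.
// Apache reads and streams the file itself after PHP exits.
function xsendfile_headers(string $path): array
{
    return [
        'Content-Type: application/octet-stream',
        'Content-Disposition: attachment; filename="' . basename($path) . '"',
        'X-Sendfile: ' . $path,
    ];
}

// Usage, after authenticating the user and sanitizing the file name:
// foreach (xsendfile_headers($path) as $h) { header($h); }
// exit;
```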