13

How can one deny access to all subdirectories of a given directory, while still allowing the access rights of single items in the directory tree to be modified manually?

I tried to do it with the <Directory(Match)> directives. The server configuration (000-sites-enabled) looks like this:

DocumentRoot /var/www
<Directory /var/www>
    Allow from all
    Deny from none
    Order deny,allow
</Directory>
<Directory /var/www/*>
    Deny from all
</Directory>

A query to http://localhost/ successfully displays /var/www/index.html and all queries to any subdirectories fail.

The problem is that any query to a file in the HTTP root also fails, i.e. requesting http://localhost/index.html results in a 403 Forbidden.

The <Directory(Match)> directives seem to actually match directories AND files!?

To see if this is true, I tried:

<Directory /var/www/i*>
    Deny from all
</Directory>

This denies access only to files/directories starting with 'i'.

Is there a way to alter this behaviour so that <Directory> matches only directories? Is there another way to deny access to all the subdirectories (besides denying each of them manually, or enabling all files manually)?

coldfix
  • Looks like a bug to me. I opened https://issues.apache.org/bugzilla/show_bug.cgi?id=50926 so someone with more experience in the core can take a look. – covener Mar 14 '11 at 12:30

6 Answers

19

In the end, the solution turns out to be pretty simple:

<Directory /var/www/*/>
    Allow from None
    Order allow,deny
</Directory>

Note the trailing slash / after the directory pattern, which will make it match only directories, not files!

This works exactly as one would expect from the <Directory> directive, in that it denies access only to the direct subdirectories of /var/www/. Specific subdirectories (anywhere in the tree) can still be re-enabled manually with <Directory> directives.

This is in contrast to <DirectoryMatch>, which will
- also match all files & directories in the tree, and
- override all <Files> or <Directory> directives for any item in the tree.
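For illustration, a specific subdirectory can be re-enabled with its own `<Directory>` block placed after the wildcard section, since `<Directory>` sections applying to the same directory are merged in configuration order, with later ones overriding earlier ones. The `public` path below is hypothetical, and the Allow/Deny syntax is the 2.2-style one used in the question (on Apache 2.4 the equivalents would be `Require all denied` / `Require all granted`):

```apache
# Deny all direct subdirectories (trailing slash: match directories only)
<Directory /var/www/*/>
    Allow from None
    Order allow,deny
</Directory>

# Hypothetical: re-enable one subdirectory. This section comes after the
# wildcard deny above, so it is merged later and overrides it.
<Directory /var/www/public>
    Order deny,allow
    Allow from all
</Directory>
```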

coldfix
  • You say, "it denies access only to the direct subdirectories", but this is not how Directory works, AFAIK. It applies to the named directories AND to their subtrees. I can't find any Directory-like directives that can apply ONLY to the named directories but not to their subtrees. I also can't find Directory-like directives that apply ONLY to the subtrees but not to the named directories. Seems to be a limitation of Apache. – David Spector Sep 06 '18 at 18:25
  • @DavidSpector It applies to the subtrees only as a default, because the parent one is forbidden, and this is perfectly desirable. However, you can freely add `<Directory>` directives for any subdirectory to re-enable them; such a block will override the above setting. – coldfix Sep 07 '18 at 13:34
  • Sorry, maybe I wasn't clear. My comment was a general one, where we assume that all directories are accessible. In such a more typical case, I can't find any Directory-like directives that can apply ONLY to named directories but not to their subtrees. I also can't find Directory-like directives that apply ONLY to subtrees but not to the named directories. Your comment is also true. – David Spector Sep 07 '18 at 17:18
3

This did it for me.

# Note: don't put $ at the end (see the comments below)
<DirectoryMatch "^/var/www/(.+)/">
    Order Allow,Deny
    Deny From All
</DirectoryMatch>


EDIT

To avoid denying sub-subdirectories as well (see the comment below), add this <DirectoryMatch> below the one above in your configuration file:

# Again no $ at the end; see the comments
<DirectoryMatch "^/var/www/(.+?)/(.+)/">
    Order Deny,Allow
    Allow From All
</DirectoryMatch>
dialer
  • I tried that (with $). Do you know why it isn't working with $? Without it, this will also match all sub-subdirectories, which isn't what I initially intended. (I wanted the server locations to be generated by the filesystem: disabling all subdirectories by default and enabling just specific ones. But now I see this might not be the best approach.) – coldfix Mar 14 '11 at 00:23
  • The $ needs to be missing, otherwise it would only match empty directories, but not any files including index.php files. You can either listen to what the other answers say, or, you can see the edit I will post there in a minute. – dialer Mar 14 '11 at 11:01
  • 1
    $ does not work in DirectoryMatch in 2.2.x, because of some odd design decision. It's in the manual. – covener Mar 14 '11 at 11:57
  • 1
    According to the manual, any _`<Directory>` directives [..] will apply only to the named directory and sub-directories of that directory (and the files within)_ anyway. Regarding your suggestion: using several `<DirectoryMatch>` directives seems to work - but I am hesitant to use this, since it will override all .htaccess or `<Files>` settings located anywhere down the tree. @covener: Ahh.. didn't see that. Thanks. – coldfix Mar 14 '11 at 19:28
1

Use this:

<Directory /var/www/public>
    allow from all
</Directory>


<DirectoryMatch "^/var/www/public/(.+)/">
    deny from all
</DirectoryMatch>

You might want to add Options etc.

The trick is how the directives are merged.
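Specifically, `<DirectoryMatch>` sections are merged after all non-regex `<Directory>` sections, so the regex deny above wins for every subdirectory of `public`. This also means a plain `<Directory>` block cannot re-enable a subdirectory under this scheme; a later `<DirectoryMatch>` would be needed instead (the `images` path below is hypothetical):

```apache
# Hypothetical: re-allow one subdirectory. A plain <Directory> block would be
# merged before the regex deny; a later <DirectoryMatch> is merged after it,
# because sections of the same type apply in configuration-file order.
<DirectoryMatch "^/var/www/public/images/">
    allow from all
</DirectoryMatch>
```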

Derick Schoonbee
  • Note: You can do this for the default virtual host or application specific... I've tested this on my Ubuntu box. If you allow listing (Indexes) the subdirs will be nicely hidden but you'll see all the files in the directory. I believe this is what you want. – Derick Schoonbee Mar 20 '11 at 14:45
0

You can disable auto-indexing in all sub-directories by removing the Indexes option from the Options directive in the configuration file, so for the default configuration the Options directive should look something like:

httpd.conf:

...
Options FollowSymLinks
...

(no "Indexes" option set.)

Then put an index.html or index.php file inside each particular sub-directory you want to be available for client access. If you want auto-indexing to be enabled in some particular directory, you can add a .htaccess file inside that directory with this line:

Options Indexes

Note that a .htaccess file affects its own directory and all of its sub-directories recursively, so for any nested sub-directory where you don't want this option, add a .htaccess file that disables auto-indexing:

Options -Indexes

Note: For .htaccess files to be enabled and take effect on the Apache configuration, you need AllowOverride All (or at least AllowOverride Options) in the <Directory> sections where you want to place .htaccess files.
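As a sketch (assuming the stock /var/www document root), the server configuration would need something like:

```apache
<Directory /var/www>
    # No Indexes here: auto-indexing is off by default in this tree
    Options FollowSymLinks
    # Let .htaccess files in this tree override Options;
    # "AllowOverride Options" is the narrower alternative to "All"
    AllowOverride All
</Directory>
```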

2i3r
0

So, I have two thoughts that might be of help (or not).

The first is that <Location> sections can override your directory permissions, so make sure you don't have any. Hitting localhost/ hits whatever you have set up as the root, which is probably overriding your security; that's why you can't get to a file when you specify it directly. So if you don't want people to be able to reach your root, you should not specify a root.

As for your point about restricting access to subdirectories, I would check out this other post. ... maybe not helpful. Perhaps more details about your use case would help.

https://serverfault.com/questions/190447/apache-2-htaccess-matching-all-sub-directories-of-current-directory

John Hinnegan
  • So instead of restricting access to all subdirectories, I should invert my directory structure, so that basically only permitted directories are reachable? Sounds sensible. Any clue why `<Directory>` matches files? – coldfix Mar 13 '11 at 23:56
  • I think the idea is more that you put things that are restricted together in a common place. It doesn't make a lot of sense to set up a scheme where you can access /foo but not /foo/bar; more sensible to me would be to be able to access /foo, but not /bar. I think that last statement is pretty subjective, just my $.02 – John Hinnegan Mar 14 '11 at 00:48
0

The best approach is to move all content not meant to be public to a directory outside the document root, e.g. /home/my/app/:

<Directory /home/my/app>
    Order Allow,Deny
    Deny from all
</Directory>

Then give the Apache user read permission on that directory, and traverse permission on all directories that lead to it (/home and /home/my).

This way there is no risk of that content leaking when a document-root configuration error occurs.
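The permissions step could look like the sketch below. It runs against a scratch tree standing in for /home/my/app so it is safe to execute anywhere; the same two chmod calls would apply to the real path (with `www-data` assumed as the Apache user on Debian/Ubuntu):

```shell
# Stand-in for /home/my/app so the commands are safe to run anywhere
mkdir -p /tmp/demo-home/my/app
echo 'private template' > /tmp/demo-home/my/app/tpl.html

# Traverse (execute) permission on every directory leading to the app dir
chmod o+x /tmp/demo-home /tmp/demo-home/my
# Read on files, traverse on directories, for everything inside the app dir
chmod -R o+rX /tmp/demo-home/my/app
```

Granting the bits to "other" is the simplest sketch; a tighter variant would `chgrp -R www-data` the tree and grant the group instead.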

Clodoaldo Neto
  • If you move all configuration/content to an external tree, all your scripts need to know its root path (which might change and result in a lot of repetitive work). How would you deal with that? Consider also the case where you want to hide (HTML) content only from some users, so creating an additional tree is not an option. – coldfix Mar 14 '11 at 19:51
  • What do you want to hide? Scripts, static files? Is this an application or just a bunch of scripts? That makes all the difference, because in an application only one script would be exposed. This one script would pass control to other scripts, which would handle authentication and authorization and serve content according to the business rules. In any case, placing forbidden content under the web root is an accident waiting to happen. I don't understand your points about the root path and repetitive work. – Clodoaldo Neto Mar 14 '11 at 21:07