
In my site "www.website.com" I have a folder to which access is forbidden, using this .htaccess:

RewriteEngine On
RewriteBase /
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/]+)/.*\ HTTP [NC]
RewriteRule .* - [F,L]

Now I want to allow access to some specific PHP files only from my own site "www.website.com", using Ajax (POST). I have modified the .htaccess like this:

RewriteEngine On
RewriteBase /
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/]+)/.*\ HTTP [NC]
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http(s)?://.*website.com [NC]
RewriteRule \.(php)$ - [NC,F,L]

But with this code I have access to all PHP files. I need access only to *Add.php and *Edit.php, and these files are in several subfolders. What should I do? Is this the right way to do it?

ioaniatr
  • To the server, the requests look exactly the same when a page is loaded directly vs by Ajax. The server just sees a request from a browser. It doesn't know the difference. Nothing would stop someone from visiting your site and then loading the Ajax page. Also, the referrer can be spoofed and should not be relied upon. – Jonathan Kuhn Oct 20 '14 at 18:45
  • So, what are you suggesting? – ioaniatr Oct 20 '14 at 23:53
  • Also, the user logs in and makes that Ajax request. I need to handle unauthorized Ajax requests in my PHP files. If I put the user credentials inside my requests, they are exposed, even if I put an encrypted password in them. Maybe this could be another topic, but is it possible? – ioaniatr Oct 20 '14 at 23:58
  • There is no 100% foolproof way to stop people from loading the script. The goal is to make it more effort than it is worth. If people are logging in, you can require that they be logged in to access the page pretty easily. But the login can be proxied through a script that authenticates. If you need to allow both logged-in and anonymous access, your next best bet (for simplicity) would be a token authorization that is sent through a request header. But there is still nothing stopping curl from getting and setting that header. It just isn't as obvious when viewing the Ajax request. – Jonathan Kuhn Oct 21 '14 at 00:03
  • If this is an API and you are trying to allow access to it, then just go look at some of the more popular APIs out there and see how they do it. Places like PayPal, Google, etc. all have APIs and they are pretty similar. – Jonathan Kuhn Oct 21 '14 at 00:05
  • This is not an API. It is more like an administration area. I can handle quite easily allowing only logged-in users to view the page with the Ajax script that does the POST. But the POST is exposed when requests are made. If someone finds out that they can do POSTs to that "file" using Ajax, I would have a problem! :-D So, should I look for token authorization? – ioaniatr Oct 21 '14 at 00:33
  • Well, you would probably be fine just relying on the login. You can check a session variable and if not set just deny access. Or the token thing would work. You just generate a random key (md5 of microtime should be fine), store it in session and send it along with every request. – Jonathan Kuhn Oct 21 '14 at 17:17
  • I think I will generate a token using SESSION and user credentials when the user logs in, send it with every request, regenerate it in the accessed PHP file, and finally compare. So, I will move the file outside of that folder and keep the first .htaccess. Do you think that using SHA to generate the token would take much more time? – ioaniatr Oct 21 '14 at 21:58
  • SHA is better, but longer. The two don't really make a difference in this scenario, and as for time, you are talking fractions of a millisecond. – Jonathan Kuhn Oct 21 '14 at 22:02
  • Probably I will use SHA then. Besides, I need it only once. Then it's easy: I will access all the required files using PHP include() depending on the POST variables, from only one accessed PHP file. Thanks a lot! – ioaniatr Oct 21 '14 at 22:09
  • Do you think that the code in .htaccess `Order Deny,Allow` `Deny from all` would be better than my first posted .htaccess? – ioaniatr Oct 22 '14 at 16:41
  • `Deny from all` will just block anyone from accessing the file, logged in or not. The .htaccess will not be able to tell if the user is logged in via PHP. It could tell if you were using a standard HTTP-based login such as `basic`, but chances are you aren't using that. – Jonathan Kuhn Oct 22 '14 at 17:12
  • Well, my logged-in users have access only to `index.php`. Then, by pushing buttons and using Ajax, they can access only one `ajax.php` file that routes, using PHP `include`/`require`, to any other PHP files. So I think that in my case `Deny from all` could be better! Now it's clear, thanks. By the way, now that you mention it, I think my HTTP authentication is `basic`. I hadn't noticed. I just checked the packets using `wireshark` and the form is in plaintext. Wow!! How can I check this using PHP (e.g. echo)? Any suggestions for implementing secure HTTP authentication? – ioaniatr Oct 22 '14 at 19:50
  • Unless you use SSL (https), everything on every website is in plaintext. And as I said before, the server doesn't know the difference between an Ajax request and directly loading the file. It just sees a request for a file. So if you are loading ajax.php via Ajax, that request will be blocked by `deny from all` too. This only blocks web requests though, so server-side stuff such as includes will still be able to access the file. – Jonathan Kuhn Oct 22 '14 at 20:22
  • Exactly! The .htaccess is in a subfolder with all my PHP files. Outside, in the public_html directory, there are only index.php and ajax.php. I have one .htaccess in every folder. The .htaccess under public_html is simple: `RewriteEngine On` `RewriteCond %{HTTP_HOST} ^www\.website\.com$ [NC]`. I don't think I need anything else in it, do I? – ioaniatr Oct 22 '14 at 20:51
  • I looked into HTTP authentication and... no, on my site it's not HTTP `basic`; it's more robust, but the user credentials are still not encrypted (SSL) when they are sent from client to server. – ioaniatr Oct 22 '14 at 22:56
  • I believe we have figured this out! Could you please post an answer so I can mark the question as answered? After all, you helped me make it clear! – ioaniatr Nov 01 '14 at 03:10
  • while I have no problem posting an answer and collecting points, it would probably be better (and perfectly acceptable) to post what your final solution was as an answer and accept that. I only suggest that because while I did help, my comments likely won't help someone who is having similar trouble in the future as I don't know what the final solution was other than just guessing that you added a simple `deny from all` and include files from that directory. – Jonathan Kuhn Nov 03 '14 at 17:28
  • You're right. I'll do that. Let me ask one more question: you said that `HTTP_REFERER` can be spoofed. Can other server variables like `REQUEST_FILENAME` or `SCRIPT_FILENAME` be easily spoofed too? – ioaniatr Nov 04 '14 at 15:38
  • `REQUEST_FILENAME` and `SCRIPT_FILENAME` can't really be spoofed because they are set from the request to the server. While they can be changed to a different file, that would just make a request to that different file and result in a 404 page, as that file probably wouldn't exist. The reason `HTTP_REFERER` can be spoofed is that it is sent by the user and doesn't affect the request (it doesn't change the requested page). It is similar to the user agent (browser): it can be changed to whatever without affecting the request. The browser can set it to whatever it wants or not send it at all. – Jonathan Kuhn Nov 04 '14 at 17:21
  • You can see the headers by opening developer tools (F12) and looking at the `Net` tab. Look at the request headers. Anything in there can be changed easily, so nothing there should be relied upon (it is potentially user input and therefore bad data). However, some things wouldn't work if changed, like the host (changing the host would just request a different website) or the URI (path and filename, which would just request a different file). Something like the referer is set by the browser and doesn't affect the request; changing it or omitting it doesn't change the page you requested. – Jonathan Kuhn Nov 04 '14 at 17:25

1 Answer


Well, after a lot of searching, reading and testing, I conclude that the best way to secure a website that works using Ajax is:

1 - Have only one file (if possible), like router.php, that "routes" depending on the POST/GET navigation variables, using includes of files that live in sub-folders.
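
A minimal sketch of what such a router.php could look like; the 'action' POST variable, the inc/ sub-folder and the file names are only assumptions for illustration:

<?php
// router.php - sketch only; 'action', 'inc/' and the file names are hypothetical
session_start();

// Whitelist of allowed actions mapped to include files,
// so nothing coming from the client is ever used as a file path directly.
$routes = array(
    'add'  => 'inc/userAdd.php',
    'edit' => 'inc/userEdit.php',
);

$action = isset($_POST['action']) ? $_POST['action'] : '';

if (!isset($routes[$action])) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

require __DIR__ . '/' . $routes[$action];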

2 - Besides SESSION-based authentication, you could also implement Basic HTTP authentication and/or HTTPS (SSL) to secure the user credentials at login. If you are not using HTTPS, you should use field or form encryption, because on the 'wire' everything is in plaintext. I have found this useful: http://www.itsyndicate.ca/jquery/
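
As a minimal sketch of the SESSION-based part (assuming your login code sets a hypothetical $_SESSION['user_id']), the router/Ajax file can simply refuse to do anything for visitors who are not logged in:

<?php
// Top of router.php / ajax.php - sketch; assumes login sets $_SESSION['user_id']
session_start();

if (empty($_SESSION['user_id'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('Not logged in');
}
// ... continue with normal request handling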

3 - In every POST use token-based authentication, with a token that is created from the user credentials, sent along with the request, and then re-calculated and compared on the server.
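
One possible way to do this, sketched with hypothetical names (which values you hash and where you store them is up to you): derive the token from the session ID and the username at login, print it into the page so the Ajax call can send it back, and re-derive it server-side before doing any work.

<?php
// Sketch only; $_SESSION['user'] and the 'token' POST field are assumed names.
// At login you would store:
//     $_SESSION['user']  = $username;
//     $_SESSION['token'] = hash('sha256', session_id() . $username);
// and print $_SESSION['token'] into the page so it is sent with every Ajax POST.
session_start();

$sent = isset($_POST['token']) ? $_POST['token'] : '';
$user = isset($_SESSION['user']) ? $_SESSION['user'] : '';
$calc = hash('sha256', session_id() . $user);

// Recompute and compare (hash_equals() needs PHP 5.6; use === on older versions).
if ($sent === '' || !hash_equals($calc, $sent)) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}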

4 - I have tried a lot of combinations for using only one .htaccess in the document root, but something was always missing, misconfigured or not working as I expected. So I found it simpler to use one .htaccess in every sub-folder instead of one in the root. Depending on what a sub-folder contains, its .htaccess should look like one of these:

### No Direct Access to All files ###
<IfModule mod_authz_host.c>
    Order Deny,Allow
    Deny from all
#   Allow from 127.0.0.1
</IfModule>

### One of the alternative ways with mod_rewrite ###
<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteBase /
    RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/]+)/.*\ HTTP [NC]
    RewriteRule .* - [F,L]
</IfModule>

### permit ONLY filetypes in a pattern ###
<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteBase /
    RewriteCond %{REQUEST_FILENAME} ^(.+)\.(css|js|gif|png|jpe?g|woff)$
    RewriteRule .? - [S=1]
    RewriteRule ^(.+)$ - [F,L,NC]
</IfModule>

I prefer THE_REQUEST because it does not include any additional headers sent by the browser, its value has not been unescaped (decoded), unlike most other variables, and there is no point in spoofing it. I use the skip flag [S=1] because I prefer to express it as "if this matches, the deny rule does not apply", so in every other case the rule that denies the request does apply.

5 - For extra security you can add code inside the PHP files themselves, implementing one of the methods described in this article: Prevent direct access to a php include file
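
For example, one of the common techniques from that question is to define a constant in the entry script (router.php here) and refuse to run any included file when it is missing; the constant name IN_APP below is just an example:

<?php
// Top of every included PHP file - sketch; router.php would contain
// define('IN_APP', true); before any include/require.
if (!defined('IN_APP')) {
    header('HTTP/1.1 403 Forbidden');
    exit('No direct access allowed');
}
// ... rest of the included file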

6 - Also, if you are forbidding image hotlinking (and not only then), as described here, beware that the referer can be spoofed!

ioaniatr