
In the process of moving an application from ColdFusion to PHP, I have a ColdFusion server running on CentOS using Apache. Despite a correct robots.txt disallowing the indexing of my application, it has come to my attention that some of the clients' files were indexed.

I need to know how to set up Apache to only allow access to the files from the server itself and NOT allow anyone to access them from the inter-google. So if you were to click a direct link it would deny access, but if you were to download the file through the application itself (using a download script) it would be allowed. Is this possible, and how?
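One way to do this (a sketch only, with placeholder paths, not tested against this exact setup) is a `<Directory>` block in the Apache configuration, or an equivalent `.htaccess` file, that denies all remote clients. The files stay readable from disk by the server process, so a download script can still serve them. In Apache 2.2 syntax:

```apacheconf
# Placeholder path for the directory holding the client files.
# Directory rules apply recursively to all subdirectories.
<Directory "/var/www/app/users">
    Order deny,allow
    Deny from all
    # Only needed if the download script fetches files over HTTP
    # rather than reading them from disk:
    # Allow from 127.0.0.1
</Directory>
```

On Apache 2.4 the equivalent is `Require all denied` (and `Require local` for the loopback exception).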

LOVE that the search engine ignored my robots.txt. Thanks!

Wally Kolcz
  • [What have you tried?](http://www.whathaveyoutried.com) What research have you done? [SO](http://stackoverflow.com/questions/9507645/htaccess-deny-from-all) – UnholyRanger Jun 04 '13 at 22:06
  • ` Order allow,deny Allow from *Server IP Address* ` – Wally Kolcz Jun 04 '13 at 22:16
  • Do I have to put each directory, or let's say I have a path mysite.com/users/information/files and I only put '/users', will it also block everyone from the folders inside? – Wally Kolcz Jun 04 '13 at 22:21
  • you could always use an `.htaccess` file at the root of where you wish to block with the line `deny from all`. Your download script should then send the file as an attachment (assuming script and file are local to each other) – UnholyRanger Jun 04 '13 at 22:35
  • Will .htaccess work with a ColdFusion server? – Wally Kolcz Jun 05 '13 at 01:04
  • `.htaccess` is an apache thing. So if apache is handling the requests, it should work – UnholyRanger Jun 05 '13 at 01:27
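Once direct access is denied, the download script only needs to read the file from disk itself and send it as an attachment, as the comments suggest. A minimal sketch of that pattern, shown here in Python for illustration (the names `PROTECTED_DIR` and `build_download_response` are made up; in PHP the equivalent would be `header()` calls plus `readfile()`):

```python
import os

# Hypothetical directory holding the client files, covered by the Apache
# deny rules so it is never served directly.
PROTECTED_DIR = "/var/www/app/users/files"

def build_download_response(filename, protected_dir=PROTECTED_DIR):
    """Return (headers, body) for a protected file, refusing path traversal."""
    base = os.path.realpath(protected_dir)
    # Resolve the requested name and make sure it stays inside the directory.
    safe_path = os.path.realpath(os.path.join(base, filename))
    if not safe_path.startswith(base + os.sep):
        raise PermissionError("refusing path outside protected dir: %r" % filename)
    with open(safe_path, "rb") as fh:
        body = fh.read()
    # Content-Disposition: attachment is what makes the browser download
    # the file instead of rendering it in place.
    headers = {
        "Content-Type": "application/octet-stream",
        "Content-Disposition": 'attachment; filename="%s"'
                               % os.path.basename(safe_path),
        "Content-Length": str(len(body)),
    }
    return headers, body
```

The traversal check matters because the filename typically comes from a request parameter; without it, a visitor could ask for `../../etc/passwd` through the script.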
