
I have a directory public/1 where all the PDF and CSV files uploaded by end users are stored, and they are very confidential files.

Right now the URL is publicly exposed, e.g.: http://www.example.com/1

How can I protect that specific URL from being exposed to the public network?

2 Answers

Close it using .htaccess?

Add it to robots.txt so it doesn't get indexed?

Add an index.html to it so it isn't an open directory listing?
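For reference, suggestion 2 would be a robots.txt in the web root along these lines (the path is taken from the question; note this is only a request to well-behaved crawlers not to index the directory, it is not access control):

```
User-agent: *
Disallow: /1/
```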

wrickdotnl
  • 2 and 3 are nowhere near a good enough measure to *protect sensitive files*. – deceze Aug 11 '15 at 09:47
  • but it's still very recommended to do :) – wrickdotnl Aug 11 '15 at 10:01
  • If you configure your web server to return a 404 or 403 for anything in that folder, or not put those files on the web in the first place, you don't need a robots.txt or index.html. – deceze Aug 11 '15 at 10:09
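The server-level blocking deceze describes could be sketched like this in the main Apache configuration (the path is a hypothetical example; `<Directory>` is only valid in the server configuration, not inside .htaccess files):

```apache
# Refuse to serve anything from the upload directory directly.
<Directory "/var/www/example.com/public/1">
    Require all denied
</Directory>
```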

You can do this with .htaccess, but note that the <Directory> directive is only valid in the main server configuration, not inside .htaccess files. Instead, create a file named .htaccess inside the upload directory itself with the following content:

Require all denied

On Apache 2.2 and older, use "Deny from all" instead. With this file in place, Apache refuses to serve anything from that directory directly.

I also suggest encrypting the files before storing them, so that even if someone manages to download a file, they can't read it.
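A minimal sketch of that encryption suggestion, using Fernet (authenticated symmetric encryption) from the third-party cryptography package; the file contents and key handling here are illustrative assumptions, not part of the original answer:

```python
from cryptography.fernet import Fernet

# Generate a key once and store it OUTSIDE the web root
# (e.g. in server configuration), never in public/1 itself.
key = Fernet.generate_key()
f = Fernet(key)

# Encrypt the uploaded bytes before writing them to disk.
plaintext = b"confidential CSV contents"
token = f.encrypt(plaintext)  # this ciphertext is what you store

# When an authorised user downloads the file, decrypt it server-side.
decrypted = f.decrypt(token)
```

Even if the directory is accidentally exposed, an attacker who downloads the stored ciphertext cannot recover the contents without the key.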

Harikrishnan