
I have this in my code:

<a href = "res/pdf/sample.pdf">Sample PDF</a>

So basically it appears as a download link to the file 'sample.pdf'. The problem is that downloads of this file are supposed to be restricted: whenever confidential reports are uploaded, a malicious user who memorized the URL or saw it in the browser history can download the file without even going through the website, because it is a direct link. What am I supposed to do to protect this link, so that the file can only be downloaded by the user it is assigned to?

JJJ

2 Answers


Don't serve up files by their direct URLs. Have a PHP script receive the filename of the file wanted, and serve it up.

So, if someone wants to download the file above, he would go to

example.com/getfile?file=sample.pdf

Your PHP script would check if the current user has permission to view the file, and then serve it up.

Make your links like this:

<a href = "http://example.com/getfile?file=sample.pdf">Sample PDF</a>

Your current method is very insecure for sensitive files. A malicious user could trivially write a script to download ALL files in res/pdf. All he needs to do is try every combination of letters as a filename and throw away the 404 errors.

You will not redirect the user, since that would defeat the purpose; you will serve the file as a download with the appropriate Content-Disposition header.
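Here's a minimal sketch of what such a gatekeeper script could look like. The user_can_access() helper, the storage path, and the PDF-only Content-Type are assumptions for illustration; plug in your own session and permission logic:

<?php
// getfile.php — a minimal sketch of a gatekeeper script.
session_start();

// Hypothetical permission check — replace with your real logic,
// e.g. look up ($userId, $file) in a table of per-user grants.
function user_can_access($userId, $file) {
    return false; // deny by default in this sketch
}

// basename() strips directory components, so traversal attempts
// like ../../etc/passwd are reduced to a bare filename.
$file = isset($_GET['file']) ? basename($_GET['file']) : '';
$path = '/var/www/protected/pdf/' . $file; // assumed location outside the web root

if ($file === '' || !is_file($path)) {
    http_response_code(404);
    exit('File not found.');
}

$userId = isset($_SESSION['user_id']) ? $_SESSION['user_id'] : null;
if (!user_can_access($userId, $file)) {
    http_response_code(403);
    exit('You do not have permission to download this file.');
}

// Serve the file as a download instead of redirecting to it.
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="' . $file . '"');
header('Content-Length: ' . filesize($path));
readfile($path);

Note that the files themselves should live outside the document root (or be blocked from direct access, as in the other answer below), so that this script is the only way to reach them.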

Here's an example: Fastest Way to Serve a File Using PHP

You can google and get many more examples.

Here's a great example that shows how to serve PDF files: https://serverfault.com/questions/316814/php-serve-a-file-for-download-without-providing-the-direct-link

Ayush
  • So whenever I use this method, basically there will be a redirection to getfile.php, so on getfile.php the URL will still change? Can you please give me an example of how the script handles the GET values I pass? I'm sorry, I'm a noob. – John Micah Fernandez Miguel Jun 01 '12 at 06:09
  • Sure, but I don't see how it's "trivial" to download all files of a given directory unless the files are listed when accessing the directory directly. You seem to underestimate the number of file name permutations. – JJJ Jun 01 '12 at 06:09
  • One would just have to check for every permutation as a filename in that directory. This process could easily be parallelized and run for a few days. I'd say that's pretty trivial if the directory stores corporate secrets. – Ayush Jun 01 '12 at 06:11
  • @Juhana: I appreciate that the number of file name permutations is huge, but if the data is valuable enough, one could easily rent some cloud computing time and have thousands of parallel threads run simultaneously. – Ayush Jun 01 '12 at 06:16
  • Let's say the file names are at most 20 characters long and they have letters both upper and lower case and some punctuation, 70 different characters in total, and you can check 1,000 file names per second. These are all very generous assumptions. Even then, checking all permutations would take about 2.53×10^26 years; the age of the universe is about 13.7×10^9 years. Even checking 10-character names with 40 different characters (case insensitive) would take 332,500 years (see the quick check after these comments). – JJJ Jun 01 '12 at 06:18
  • @Juhana: You do have a point. Perhaps it's not trivial to get ALL files, but it is very possible to get SOME files, which can be pretty bad as well. – Ayush Jun 01 '12 at 06:22
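For what it's worth, the figures in the comments check out. A quick back-of-the-envelope script, assuming the stated rate of 1,000 guesses per second:

<?php
// Rough brute-force time estimates from the comment thread.
$secondsPerYear = 60 * 60 * 24 * 365;
$rate = 1000; // guesses per second (generous assumption)

// 20-character names over a 70-character alphabet:
printf("%.2e years\n", pow(70, 20) / $rate / $secondsPerYear); // ~2.53e+26 years

// 10-character names over a 40-character alphabet:
printf("%.0f years\n", pow(40, 10) / $rate / $secondsPerYear); // ~332,500 years

So exhaustively enumerating long random names is hopeless, though short or guessable names (sample.pdf, report2012.pdf) remain easy targets.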

You can restrict access using .htaccess.
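For example, assuming an Apache server with AllowOverride enabled for the directory, an .htaccess file dropped into res/pdf/ denies all direct requests, so the files can only be served through a server-side script like the one in the other answer:

# res/pdf/.htaccess — block all direct HTTP access to this directory.

# Apache 2.4 syntax:
Require all denied

# Apache 2.2 syntax:
# Order deny,allow
# Deny from all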

Muthu Kumaran