
I found several programs on the internet that can grab your website and download the whole thing onto your PC. How can one secure a website against these programs?

Link: http://www.makeuseof.com/tag/save-and-backup-websites-with-httrack/

user288645
  • I am developing a site in Dotnetnuke version 6.0 – user288645 Aug 20 '13 at 06:47
  • Why would one want to prevent that? If the user wants to download your site (which you made public) into local HTML files, you cannot prevent that anyway (he can always save single pages from his browser). So why block an automated approach? – arkascha Aug 20 '13 at 06:50
  • possible duplicate of [How to prevent unauthorized spidering](http://stackoverflow.com/questions/449376/how-to-prevent-unauthorized-spidering) – Merlyn Morgan-Graham Aug 20 '13 at 06:53

1 Answer


First of all, you have to tell whether the visitor is a human or a bot. This is no easy task; see e.g.: Tell bots apart from human visitors for stats?

Then, once you have detected which bot it is, you can decide whether you want to serve it your website content or not. Legitimate bots (like Googlebot) conveniently identify themselves through their User-Agent string; malicious bots / web crawlers may disguise themselves as common browsers.
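As a rough illustration only (not DotNetNuke-specific), here is a minimal sketch of a User-Agent filter, assuming a small Flask app and an illustrative blacklist of downloader signatures such as HTTrack and wget. Keep in mind that tools like HTTrack let the user change the User-Agent, so this only stops naive use.

```python
# Minimal sketch: refuse requests whose User-Agent matches known
# site-downloader signatures. Assumes a Flask app; the blacklist
# below is illustrative, not exhaustive.
from flask import Flask, request, abort

app = Flask(__name__)

# Hypothetical blacklist of downloader / crawler signatures
BLOCKED_AGENTS = ("httrack", "wget", "curl", "webzip")

@app.before_request
def block_site_downloaders():
    user_agent = (request.headers.get("User-Agent") or "").lower()
    if any(sig in user_agent for sig in BLOCKED_AGENTS):
        abort(403)  # refuse to serve the content to these clients

@app.route("/")
def index():
    return "Regular page content"
```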

There is no 100% solution, anyway.

If your content is really sensitive, you may want to add a CAPTCHA, or user authentication.
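For the authentication route, a minimal sketch (again assuming a Flask app, with placeholder credentials) could put a sensitive page behind HTTP Basic Auth, so an automated downloader without credentials only ever receives a 401:

```python
# Minimal sketch: require HTTP Basic Auth before serving a sensitive
# page. Credentials below are placeholders; a real site would check
# against its own user store.
from flask import Flask, request, Response

app = Flask(__name__)

USERNAME = "editor"      # placeholder credential
PASSWORD = "change-me"   # placeholder credential

@app.route("/private/report")
def private_report():
    auth = request.authorization
    if not auth or auth.username != USERNAME or auth.password != PASSWORD:
        # Prompt the browser for credentials; bots without them get nothing.
        return Response("Authentication required", 401,
                        {"WWW-Authenticate": 'Basic realm="Protected"'})
    return "Sensitive content"
```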

Community
  • I have added a CAPTCHA and user authentication, but the CAPTCHA is not on every webpage. @arkascha: for security purposes, so that no one can duplicate my content. – user288645 Aug 20 '13 at 08:08