I'm trying to block all bots/crawlers/spiders from one specific directory. How can I do that with .htaccess? I searched a bit and found a solution that blocks based on the user agent:
RewriteCond %{HTTP_USER_AGENT} googlebot
Now I would need more user agents (ideally all known bots), and the rule should apply only to that one directory. I already have a robots.txt, but not all crawlers respect it ... Blocking by IP address is not an option. Are there other solutions? I know about password protection, but I would have to ask first whether that is acceptable. Either way, I'm looking for a solution based on the user agent.
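For reference, here is a minimal sketch of what I have in mind so far, extending the snippet above. The bot names are just examples, and the directory is a placeholder; placing the .htaccess file inside the directory itself should limit the rule's scope to it:

```apache
# .htaccess inside the directory to protect (e.g. /protected-dir/.htaccess)
# The user-agent list below is an example -- extend it as needed.
RewriteEngine On

# [NC] = case-insensitive match, [OR] = chain conditions with logical OR
RewriteCond %{HTTP_USER_AGENT} googlebot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} bingbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} (bot|crawler|spider) [NC]

# [F] sends 403 Forbidden, [L] stops processing further rules
RewriteRule .* - [F,L]
```

The catch-all pattern `(bot|crawler|spider)` would cover many well-behaved crawlers by convention, but I assume bots that spoof a browser user agent would still get through, which is why I'm asking whether there is a better approach.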