
I just learned why Request.Browser.Crawler is always false in C# (http://www.digcode.com/default.aspx?page=ed51cde3-d979-4daf-afae-fa6192562ea9&article=bc3a7a4f-f53e-4f88-8e9c-c9337f6c05a0).

Does anyone use a method to dynamically update the crawler list, so that Request.Browser.Crawler becomes genuinely useful?

Click Ok

2 Answers


I've been happy with the results supplied by Ocean's Browsercaps. It supports crawlers that Microsoft's config files haven't bothered detecting. It will even parse out which version of the crawler is hitting your site, not that I really need that level of detail.

DavGarcia

You could check the user agent with a regex against Request.UserAgent.

Peter Bromberg wrote a nice article about writing an ASP.NET Request Logger and Crawler Killer.

Here is the method he uses in his Logger class:

// requires: using System.Web; and using System.Text.RegularExpressions;
public static bool IsCrawler(HttpRequest request)
{
   // set the next line to "bool isCrawler = false;" to use this method to deny certain bots
   bool isCrawler = request.Browser.Crawler;
   // Microsoft doesn't properly detect several crawlers
   if (!isCrawler)
   {
       // put any additional known crawlers in the Regex below;
       // you can also use this list to deny certain bots instead, if desired:
       // just set "bool isCrawler = false;" in the first line of the method
       // and keep only the bots you want to deny in the Regex list
       Regex regEx = new Regex("Slurp|slurp|ask|Ask|Teoma|teoma");
       isCrawler = regEx.Match(request.UserAgent).Success;
   }
   return isCrawler;
}
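To address the original question of keeping the crawler list up to date without recompiling, one option (a sketch, not part of Bromberg's article; the class and key names here are illustrative) is to keep the pattern in a config value and match it case-insensitively against the user agent:

```csharp
using System;
using System.Text.RegularExpressions;

// Sketch: the pattern would typically come from web.config, e.g.
//   <appSettings>
//     <add key="CrawlerPattern" value="Slurp|Teoma|ask|bot|spider" />
//   </appSettings>
// read via ConfigurationManager.AppSettings["CrawlerPattern"], so the
// list can be edited without redeploying code.
public static class CrawlerDetector
{
    public static bool MatchesCrawlerPattern(string userAgent, string pattern)
    {
        // Guard against missing user agents and empty patterns.
        if (string.IsNullOrEmpty(userAgent) || string.IsNullOrEmpty(pattern))
            return false;

        // IgnoreCase removes the need to list "Slurp|slurp" style duplicates.
        return Regex.IsMatch(userAgent, pattern, RegexOptions.IgnoreCase);
    }
}
```

In a page or handler you would then combine it with the built-in flag: `bool isCrawler = Request.Browser.Crawler || CrawlerDetector.MatchesCrawlerPattern(Request.UserAgent, pattern);`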
splattne
    Warning - this is *not* fool-proof! If you install certain versions of the Ask.com toolbar (in IE, at least) it will modify the user-agent to include 'Ask' in some form, causing false-positives. – Kurt Schindler Sep 06 '10 at 23:10