
I'm trying to record visits to my site, but crawlers keep accidentally setting off my tracking code. Is there any way in Rails controllers to determine whether a visitor is a bot, such as Googlebot?

Rob
    If this is casual, then you can inspect the user agent string, or create a valid robots.txt. If this is security, then the bot can easily lie about it being a bot. You can also require authentication for stuff you don't want to be seen. –  Dec 04 '13 at 20:18

1 Answer


You can check HTTP headers, particularly the user agent string.

http://www.useragentstring.com/pages/Googlebot/

Most well-behaved bots include "bot" somewhere in their user-agent string.

Another suggestion is to use something like Google Analytics to track your visits; it handles cases like this far better than a homegrown implementation.
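To make the user-agent check concrete, here is a minimal sketch. The sample Googlebot string follows the format documented at useragentstring.com; the naive "contains bot" test is an assumption that catches common friendly crawlers, not a complete detector.

```ruby
# Example Googlebot User-Agent string (format per useragentstring.com):
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

# Naive check: treat any user agent containing "bot" (case-insensitive)
# as a crawler. Malicious bots can lie, so this only filters friendly ones.
def looks_like_bot?(user_agent)
  user_agent.to_s.downcase.include?("bot")
end
```

In a Rails controller the user agent is available as `request.user_agent`, so the same check applies to incoming requests.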

Bryan
  • Any ideas how I could do this in routes.rb? – Rob Dec 04 '13 at 21:10
  • routes.rb is only for configuring routing. Instead, add a `before_filter` method in your ApplicationController to check the user agent on every request and act accordingly. – Bryan Dec 04 '13 at 22:06
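The `before_filter` approach from the comment above can be sketched as follows, assuming a Rails 3-era app (in Rails 4+ the hook is named `before_action`). The pattern, the module name, and `record_visit` are illustrative assumptions; the filter logic is kept in a plain module so it can be tested without loading Rails.

```ruby
# Framework-agnostic bot check, so the logic is testable in isolation.
# The pattern is an assumption covering common friendly-crawler keywords.
module BotDetection
  BOT_PATTERN = /bot|crawl|spider|slurp/i

  # True when the given User-Agent string looks like a crawler.
  def bot_request?(user_agent)
    !!(user_agent.to_s =~ BOT_PATTERN)
  end
end

# Wiring inside the Rails app (shown for context, not executed here):
#
#   class ApplicationController < ActionController::Base
#     include BotDetection
#     before_filter :track_visit   # before_action in Rails 4+
#
#     private
#
#     def track_visit
#       return if bot_request?(request.user_agent)
#       record_visit  # hypothetical: your existing visit-recording code
#     end
#   end
```

Because the filter runs in ApplicationController, every controller inherits it, which matches the "check the user agent for every call" suggestion in the comment.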