I have a (Rails) site and I want search engines to crawl and index it. However, I also want to log certain actions as having happened, and these actions can be triggered by logged-in users as well as users who are not logged in. To ensure that the count for non-logged-in (i.e. anonymous) users doesn't include bot traffic, I am considering a few options and am looking for guidance on which way to go:
1. Set a cookie for all users. Since bots usually don't accept or send back cookies, if the cookie doesn't come back I can distinguish bots from anonymous humans (sketch below).
2. Check the User-Agent header against some whitelist of known bots, along the lines of "How to recognize bots with php?" (sketch below).
3. Make that action a POST rather than a GET. Well-behaved bots only issue GETs, so they wouldn't trigger the count (sketch below).
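
For option 1, here's roughly what I have in mind as a minimal sketch; the cookie name `human_check` and the `ActionLog` model are made up for illustration:

```ruby
class ApplicationController < ActionController::Base
  # Drop a marker cookie on every response; most bots never send it back.
  before_action :set_human_marker

  private

  def set_human_marker
    cookies[:human_check] ||= { value: "1", expires: 1.year.from_now }
  end
end

class EventsController < ApplicationController
  def track
    # request.cookies contains only what the client actually sent, so a
    # cookie-less bot won't be counted.
    ActionLog.create!(name: params[:name]) if request.cookies["human_check"].present?
    head :ok
  end
end
```

The obvious downside is that a first-time human visitor wouldn't be counted either, since the cookie only comes back starting with their second request.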
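For option 2, something like this helper is what I'm imagining; the pattern below is just a rough starting point, not an exhaustive or current bot list:

```ruby
class ApplicationController < ActionController::Base
  # Fragments that appear in common crawler User-Agent strings; real
  # lists are much longer and need periodic updating.
  BOT_UA_PATTERN = /(googlebot|bingbot|slurp|duckduckbot|baiduspider|yandex|crawler|spider|bot)/i

  private

  def bot_request?
    request.user_agent.to_s.match?(BOT_UA_PATTERN)
  end
end
```

Then the logging line becomes something like `ActionLog.create!(...) unless bot_request?`. The weakness I see is that User-Agent strings can be spoofed and the list goes stale.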
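And for option 3, the route change itself is trivial (route and controller names are placeholders):

```ruby
# config/routes.rb
# Well-behaved crawlers only follow links (GETs), so moving the tracked
# action behind a POST keeps them from triggering it.
post "track/:name", to: "events#track", as: :track_event
```

Triggering it from a page then needs a form or JS request carrying the CSRF token (e.g. `button_to`), and an aggressive or malicious bot can of course still issue POSTs.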
Any other approaches?
I am sure folks have had to do this before, so what's the 'canonical' way to solve it?