I'm seeing a lot of exceptions in an app -- one that was converted from an off-the-shelf ecommerce site about a year ago -- whenever a spider hits routes that no longer exist. There aren't many of these routes, but various spiders hit them, sometimes multiple times a day. I've blocked the worst offenders (garbage spiders, mostly), but obviously I can't block Google and Bing. There are too many URLs to remove manually.
I'm not sure why the app doesn't return a 404 code. I'm guessing one of the routes is catching the URLs and trying to generate a view, but since the resource is missing, the lookup returns nil, which is what's throwing the errors. Like this:
undefined method `status' for nil:NilClass
app/controllers/products_controller.rb:28:in `show'
Again, this particular product is gone, so I'm not sure why the app didn't return the 404 page. Instead it tries to generate the view even though the resource doesn't exist: it checks whether the nil resource has a public status, and the error is thrown.
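To show what I suspect is happening, here's a minimal plain-Ruby stand-in for the failure mode (the Product class, the slug lookup, and the status values are all my guesses, not the app's actual code):

```ruby
# Stand-in model mimicking ActiveRecord's find_by, which returns nil
# (rather than raising) when no record matches. Names are hypothetical.
class Product
  RECORDS = { "widget" => "public" }

  def self.find_by_slug(slug)
    RECORDS.key?(slug) ? new(RECORDS[slug]) : nil
  end

  attr_reader :status

  def initialize(status)
    @status = status
  end
end

# A dead URL means the lookup returns nil...
product = Product.find_by_slug("deleted-product")

begin
  # ...and then the status check blows up instead of 404ing.
  product.status == "public"
rescue NoMethodError => e
  puts e.class # the same NoMethodError on nil I'm seeing in the logs
end
```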
If I rescue ActiveRecord::RecordNotFound, will that do it? It's kind of hard to test, as I have to wait for the various bots to come through.
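In case it clarifies what I mean, here's a sketch of the change I have in mind: use a raising lookup and rescue the not-found error to render a 404. Everything below is a plain-Ruby simulation (the error class, controller, and response tuples are stand-ins, since I can't easily exercise the real bots):

```ruby
# Stand-in for ActiveRecord::RecordNotFound.
class RecordNotFound < StandardError; end

class Product
  RECORDS = { "widget" => "public" }

  # Mimics ActiveRecord's find: raises instead of returning nil.
  def self.find!(slug)
    RECORDS.fetch(slug) { raise RecordNotFound, "no product #{slug}" }
  end
end

class ProductsController
  # In real Rails I'd expect to use:
  #   rescue_from ActiveRecord::RecordNotFound, with: :render_404
  # in ApplicationController instead of a per-action rescue.
  def show(slug)
    status = Product.find!(slug)
    [200, "product is #{status}"]
  rescue RecordNotFound
    [404, "not found"] # i.e. render the public 404 page with status: :not_found
  end
end

controller = ProductsController.new
controller.show("widget")        # => [200, "product is public"]
controller.show("deleted-thing") # => [404, "not found"]
```

The point being that a missing record would turn into a clean 404 response for the spiders rather than a 500.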
I also have trouble with some links that rely on a cookie being set for tracking: if the cookie isn't set, the app sets it before processing the request. That doesn't seem to work with the spiders. I've marked those links as nofollow, but not all the spiders seem to honor that.
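The cookie logic is roughly this shape (the method and cookie names below are my paraphrase, not the actual code, and the Hash is a stand-in for the Rails cookie jar). My guess is I'd need to skip the set-then-process step for clients that can't accept a cookie:

```ruby
# Hypothetical tracking-cookie guard; a plain Hash stands in for cookies.
TRACKING_KEY = :tracker_id

def ensure_tracking_cookie(cookies, bot: false)
  return cookies if bot                    # skip tracking for known spiders
  cookies[TRACKING_KEY] ||= "anon-#{rand(10_000)}" # set only if missing
  cookies
end

jar = {}
ensure_tracking_cookie(jar)           # sets a new tracking id for a browser
ensure_tracking_cookie({}, bot: true) # leaves the jar untouched for a bot
```

That's only a guess at the shape of a fix, though; the real question is why the cookie-dependent links break for the spiders in the first place.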