Yesterday, I set up Elmah to send me an email every time my website throws an unhandled exception.
This morning, I woke up and saw 6 emails with the exception:
"System.Web.HttpException: The controller for path '/none' was not found or does not implement IController."
They were from the Bing crawler bot, and there were two more from the Google and Yahoo crawler bots.
I've gone through my code and tested all the links on my website and haven't found any errors. Nowhere do I try to route to a 'none' controller. The other two errors were similar but pointed to a couple of other controllers, which work fine when I test them.
What am I doing wrong and how can I get rid of these errors?
I'm concerned that they will rank my website lower in search engines.
Update:
I was finally able to reproduce the errors. In my browser, I typed in "www.mysite.com/abcde". My site's custom 404 error page showed, which is good. However, Elmah generated an email saying:
"The controller for path '/abcde' was not found or does not implement IController"
When I typed in: "www.mysite.com/Home/abcde", my custom 404 error page showed. Elmah then sent this email:
"A public action method 'abcde' was not found on controller 'MySite.Controllers.HomeController'."
This leads me to conclude that the web crawlers are trying to access URLs on my site that have been removed or never existed.
How can I get Elmah to not log errors caused by requests for URLs that don't exist on my site? I don't want to get an email every time a visitor or web crawler enters a URL that doesn't exist.
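For anyone reading along: ELMAH supports error filtering, and one way to apply it is by handling its `*_Filtering` events in Global.asax and dismissing 404s. The sketch below assumes ELMAH's logging and mail modules are already registered in web.config; the helper name `FilterOut404` is my own.

```csharp
using System.Web;
using Elmah;

public class MvcApplication : HttpApplication
{
    // ELMAH wires these handlers up by name; each fires just before
    // an error is logged or mailed, and Dismiss() suppresses it.
    void ErrorLog_Filtering(object sender, ExceptionFilterEventArgs e)
    {
        FilterOut404(e);
    }

    void ErrorMail_Filtering(object sender, ExceptionFilterEventArgs e)
    {
        FilterOut404(e);
    }

    // Hypothetical helper: dismiss any HttpException whose status is 404,
    // which covers both "controller not found" and "action not found".
    static void FilterOut404(ExceptionFilterEventArgs e)
    {
        var http = e.Exception.GetBaseException() as HttpException;
        if (http != null && http.GetHttpCode() == 404)
            e.Dismiss();
    }
}
```

This keeps 404s out of both the log and the mail; filtering only `ErrorMail_Filtering` would keep the emails quiet while still recording the errors for later review.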
Also, is there a way to route the most common, invalid web crawler URLs like '/none' and '/error_log' to my main home page?
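One common approach (a sketch, not a recommendation) is a catch-all route registered after the default route, so unmatched URLs like '/none' fall through to the home page instead of raising a missing-controller exception. The route names here are illustrative.

```csharp
using System.Web.Mvc;
using System.Web.Routing;

public class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

        // Standard MVC default route.
        routes.MapRoute(
            name: "Default",
            url: "{controller}/{action}/{id}",
            defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional });

        // Catch-all: {*anything} greedily matches any remaining path,
        // so requests no earlier route handled go to Home/Index.
        routes.MapRoute(
            name: "CatchAll",
            url: "{*anything}",
            defaults: new { controller = "Home", action = "Index" });
    }
}
```

Be aware this serves the home page with a 200 status for URLs that don't exist, which search engines may treat as "soft 404s"; returning a genuine 404 (as the site already does) is usually better for SEO than silently showing the home page.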