I'm not sure how reliable Google's robots.txt tester is, and I'm wondering whether the following robots.txt for my WooCommerce site will actually do the trick: block bots from adding to cart and from crawling the cart pages, allow good bots like Google to crawl the site, and block a few bots that have been causing resource problems. Here's my example below, with my comments in bold (the comments are not included in the actual robots.txt file):
**Block some crawlers that were causing resource issues (do I need a separate `Disallow: /` for each one? I try to check that in the second sketch below):**

```
User-agent: Baiduspider
User-agent: Yandexbot
User-agent: MJ12Bot
User-agent: DotBot
User-agent: MauiBot
Disallow: /
```

**Allow all other bots:**

```
User-agent: *
Allow: /
```

**Stop all allowed bots from adding to cart and from crawling the cart, checkout, and my-account pages:**

```
Disallow: /*add-to-cart=*
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Crawl-delay: 10

Sitemap: https://www.example.com/sitemap.xml
```
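Separately from Google's tester, here's a quick sketch of how I've been sanity-checking the matching locally. It assumes the third-party protego package (`pip install protego`), which as far as I know follows Google's matching rules including the `*` wildcard (unlike the standard library's `urllib.robotparser`), and the product URL is just a made-up example:

```python
# Sketch: check which URLs the rules above block for Googlebot.
# Assumes the third-party "protego" parser (pip install protego).
from protego import Protego

robots_txt = """
User-agent: Baiduspider
User-agent: Yandexbot
User-agent: MJ12Bot
User-agent: DotBot
User-agent: MauiBot
Disallow: /

User-agent: *
Allow: /
Disallow: /*add-to-cart=*
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Crawl-delay: 10

Sitemap: https://www.example.com/sitemap.xml
"""

rp = Protego.parse(robots_txt)

# Made-up example URLs; swap in real paths from the shop.
urls = [
    "https://www.example.com/shop/",
    "https://www.example.com/product/blue-widget/?add-to-cart=123",
    "https://www.example.com/cart/",
    "https://www.example.com/checkout/",
    "https://www.example.com/my-account/",
]

for url in urls:
    verdict = "allowed" if rp.can_fetch(url, "Googlebot") else "blocked"
    print(f"{url} -> {verdict}")
```

If the file behaves the way I expect, only /shop/ should come back as allowed for Googlebot, with the add-to-cart URL caught by the `/*add-to-cart=*` wildcard.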
I put this through Google's robots.txt checker and it came back with one warning on the crawl-delay line, telling me that directive would be ignored.
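And on the question in my first comment (whether each of the blocked crawlers needs its own `Disallow: /`): my understanding is that several `User-agent` lines stacked above one set of rules form a single group that shares those rules, so one `Disallow: /` should cover all five. Here's the second sketch I mentioned, again using protego, this time against the live file (with example.com standing in for my real domain, and assuming the robots.txt above is already deployed):

```python
# Sketch: confirm the single "Disallow: /" applies to every bot in the stacked
# User-agent group. Fetches the deployed robots.txt; example.com is a stand-in.
import urllib.request

from protego import Protego

with urllib.request.urlopen("https://www.example.com/robots.txt") as resp:
    rp = Protego.parse(resp.read().decode("utf-8"))

blocked_bots = ["Baiduspider", "Yandexbot", "MJ12Bot", "DotBot", "MauiBot"]

for bot in blocked_bots:
    home = rp.can_fetch("https://www.example.com/", bot)
    product = rp.can_fetch("https://www.example.com/product/blue-widget/", bot)
    print(f"{bot}: homepage allowed={home}, product allowed={product}")
```

If the grouping works the way I think it does, every bot in that list should print False for both URLs, while Googlebot (which falls through to the `User-agent: *` group) would still be allowed on the product page. Does that all look right, or am I missing something?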