I have written some rules to block a few URLs in robots.txt, and now I want to verify those rules. Is there a tool for verifying a robots.txt file?
I have written this rule:
Disallow: /classifieds/search*/
to block these URLs:
http://example.com/classifieds/search?filter_states=4&filter_frieght=8&filter_driver=2
http://example.com/classifieds/search?keywords=Covenant+Transport&type=Carrier
http://example.com/classifieds/search/
http://example.com/classifieds/search
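In the meantime, I can do a rough local check with Python's standard library. This is only a sketch: `urllib.robotparser` implements the original robots.txt spec and does not interpret `*` wildcards, so it tests the plain prefix rule `Disallow: /classifieds/search` rather than the wildcard variant:

```python
# Rough local check of a robots.txt rule using Python's standard library.
# Caveat: urllib.robotparser follows the original 1996 spec and treats '*'
# in paths literally, so only the plain prefix rule is tested here.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /classifieds/search",
]

rp = RobotFileParser()
rp.parse(rules)  # parse the rules directly instead of fetching over HTTP

urls = [
    "http://example.com/classifieds/search?keywords=Covenant+Transport&type=Carrier",
    "http://example.com/classifieds/search/",
    "http://example.com/classifieds/search",
    "http://example.com/classifieds/other",  # not covered by the rule
]
for url in urls:
    status = "blocked" if not rp.can_fetch("*", url) else "allowed"
    print(url, "->", status)
```

The first three URLs come back as blocked and the last as allowed, since the rule is a simple path prefix match.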
I also want to know the difference between these rules:
Disallow: /classifieds/search*/
Disallow: /classifieds/search/
Disallow: /classifieds/search