This is my robots.txt. I want to allow only the base URL, domain.com, to be indexed and to disallow all sub-URLs like domain.com/foo and domain.com/bar.html:
User-agent: *
Disallow: /*/
Because I am not sure whether this is valid syntax, I tested it with Google Webmaster Tools, which shows me this message:
robots.txt file is probably invalid.
Is my file valid? Is there a better way to allow only the base URL to be indexed?
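For reference, one alternative I have seen suggested is to explicitly allow only the root and disallow everything else. I am not sure whether this is actually better; as far as I know it relies on the Allow directive and the $ end-of-URL anchor, which Googlebot supports but the original robots.txt standard does not define:

User-agent: *
Allow: /$
Disallow: /

Would that be preferable to my wildcard rule?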
Update: Google downloaded my robots.txt 4 hours ago. I think that's why it doesn't work yet. I will wait some time, and if the problem persists, I will update my question again.