I just ran into a robots.txt that looks like this:
```
User-agent: *
Disallow: /foobar
User-agent: badbot
Disallow: *
```
Since the `*` group already disallows a few folders for every user agent, does the specific `badbot` group even apply, or do the two groups combine?
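For what it's worth, here is how I would probe the group selection with Python's built-in `urllib.robotparser` (a minimal sketch; `example.com` and the agent name `somebot` are just placeholders I made up). One caveat: `urllib.robotparser` follows the original robots.txt draft, where a `Disallow` value is a plain path prefix, so its verdict on `Disallow: *` may differ from crawlers that implement wildcard matching; the interesting part is which group each user agent ends up matching.

```python
import urllib.robotparser

# The exact ruleset from the question.
ROBOTS_TXT = """\
User-agent: *
Disallow: /foobar
User-agent: badbot
Disallow: *
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())  # parse() accepts the lines directly

# "somebot" stands in for any crawler other than badbot; example.com is a
# placeholder host. A conforming crawler obeys the single best-matching
# User-agent group, falling back to "*" only if no specific group matches.
for agent in ("somebot", "badbot"):
    for path in ("/foobar", "/other"):
        url = "https://example.com" + path
        print(f"{agent:8s} {path:8s} allowed={rp.can_fetch(agent, url)}")
```

Whether `badbot` is still reported as blocked from `/foobar` should reveal whether the `*` group still applies to it or whether its own group replaces it entirely.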
Note: This question is merely for understanding the above ruleset. I know using robots.txt is not a proper security mechanism and I'm neither using nor advocating it.