
My robots.txt is the following:

User-agent: *
Disallow: /user/*
Disallow: /invitations/*
Disallow: /api/*

#Adsense crawler
User-agent: Mediapartners-Google
Disallow:

Is the "Mediapartners-Google" crawler really allowed to crawl all pages? Or will the first group of my robots.txt (User-agent: *) prevent all crawlers, including "Mediapartners-Google", from accessing the paths mentioned above, even though these lines appear at the end of the file:

User-agent: Mediapartners-Google
Disallow:

In other words, does the order of rules matter in robots.txt, as in my situation?

unor
Sid

2 Answers


It seems I have found the answer, and the answer is that order does not matter, because:

In a robots.txt file with multiple user-agent directives, each disallow or allow rule only applies to the useragent(s) specified in that particular line break-separated set. If the file contains a rule that applies to more than one user-agent, a crawler will only pay attention to (and follow the directives in) the most specific group of instructions.

Source: https://moz.com/learn/seo/robotstxt
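This grouping behavior can be checked offline with Python's `urllib.robotparser`. Note one assumption in the sketch below: the stdlib parser implements the original robots.txt convention and does not support Google's `*` path wildcard, so the `Disallow` paths are written as plain prefixes (`/user/` instead of `/user/*`):

```python
from urllib import robotparser

# Simplified version of the robots.txt above: prefix paths instead of
# "*" wildcards, which urllib.robotparser does not understand.
RULES = """\
User-agent: *
Disallow: /user/
Disallow: /invitations/
Disallow: /api/

User-agent: Mediapartners-Google
Disallow:
"""

rp = robotparser.RobotFileParser()
rp.parse(RULES.splitlines())
rp.modified()  # mark the rules as loaded, since we bypassed read()

# An unnamed crawler falls back to the "User-agent: *" group and is blocked:
print(rp.can_fetch("SomeOtherBot", "/user/profile"))

# Mediapartners-Google matches its own, more specific group, whose empty
# Disallow permits everything -- the "*" group is ignored for it:
print(rp.can_fetch("Mediapartners-Google", "/user/profile"))
```

The first call prints `False` and the second `True`: each crawler obeys only the single group that matches it, never a union of all groups.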


Order does not matter, and you can test it with the robots.txt tester in Search Console: change the order, test a few pages, and see whether anything behaves unexpectedly.
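The same order test can be done offline with Python's `urllib.robotparser`, parsing the file with the groups in both orders and comparing the verdicts (again with simplified prefix paths, since the stdlib parser has no `*` wildcard support):

```python
from urllib import robotparser

def allowed(robots_txt, agent, path):
    """Parse a robots.txt string and ask whether agent may fetch path."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    rp.modified()  # mark the rules as loaded, since read() was bypassed
    return rp.can_fetch(agent, path)

GROUP_ALL = "User-agent: *\nDisallow: /user/\n"
GROUP_ADSENSE = "User-agent: Mediapartners-Google\nDisallow:\n"

for robots in (GROUP_ALL + "\n" + GROUP_ADSENSE,   # "*" group first
               GROUP_ADSENSE + "\n" + GROUP_ALL):  # "*" group last
    # Groups are matched by user-agent name, not by position in the
    # file, so both orderings give the same answers.
    print(allowed(robots, "Mediapartners-Google", "/user/x"),
          allowed(robots, "SomeOtherBot", "/user/x"))
```

Both iterations print `True False`: Mediapartners-Google stays allowed and the generic crawler stays blocked regardless of which group comes first.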