
I have a large site (40,000+ pages) and want to minimise the number of pages that Google indexes, i.e. I wish to index only 500 pages.

I can only seem to find the option of noindexing pages, and noindexing 39,500 pages wouldn't make sense.

An example of my robots.txt file:

User-agent: *
Noindex:  /category/long-site-url-1/
Noindex:  /
Noindex:  /site-url-2/
Noindex:  /site-url-3/

Sitemap: https://sitedomain/sitemap.xml
Owen O'Neill

1 Answer


Noindex in robots.txt was only ever an experimental feature, and it was never documented or specified.

Instead, you should add the meta robots element in the HTML, or send the X-Robots-Tag header in the HTTP response, for each affected page:

<meta name="robots" content="noindex" />
X-Robots-Tag: noindex
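Since only 500 of the 40,000 pages should be indexable, it is usually easier to invert the logic on the server: keep an allowlist of indexable paths and attach the X-Robots-Tag header to everything else. A minimal sketch in Python (the INDEXABLE set and the helper name are hypothetical; load your real list from a file or database):

```python
# Hypothetical allowlist of the ~500 paths that SHOULD be indexed.
INDEXABLE = {"/", "/about/", "/products/"}

def robots_headers(path):
    """Return extra response headers for a request path.

    Paths on the allowlist get no X-Robots-Tag, so search engines
    may index them; every other path gets "noindex".
    """
    if path in INDEXABLE:
        return []
    return [("X-Robots-Tag", "noindex")]
```

The same rule could be expressed as a rewrite/header directive in your web server's configuration instead; the point is that the decision lives in one place rather than in 39,500 individual page templates.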
unor