
Possible Duplicate:
How to detect fake users (crawlers) and cURL

Some pages of my website are being crawled undesirably, for example:

abc.com/smarty/templates/1.html

abc.com/smarty/templates/2.html

abc.com/images/1.jpg

abc.com/images

  • I want to prevent these pages from being indexed.
  • I also want to remove these pages from Google's index.

I know how to do this with Apache settings, but since I am on shared hosting, I don't have access to them.

Please help.

3 Answers


You can use a text file called robots.txt, which search engines read to find out which pages they can and cannot index on your website.

Here is a good article about how to write this file: Robots.txt: What it is, Why it’s Used and How to Write it
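As a minimal sketch using the paths from your question (the abc.com domain and directory names are assumed from your examples), the file would go at the root of your site, e.g. abc.com/robots.txt:

User-agent: *
Disallow: /smarty/templates/
Disallow: /images/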


How to remove a page from Google's index was discussed here.

– Sami

In robots.txt, add rules like:

User-agent: *
Disallow: /smarty/
Disallow: /images/

(Disallow values are prefix matches, so the directory path alone covers everything beneath it; the trailing * is redundant.)
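Once the file is in place, a quick way to confirm it is being served (using the example domain from the question) is:

curl http://abc.com/robots.txt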
– chicharito

Use Google Webmaster Tools: www.google.com/webmasters/tools/

[Screenshot of Google Webmaster Tools]

You can fetch pages as Google, remove pages from Google's index, upload your sitemap, and much more. I think this is exactly what you are looking for.
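If you use the sitemap upload, a minimal sitemap.xml follows the standard sitemaps.org format; the URL below is just the example domain from the question:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://abc.com/</loc>
  </url>
</urlset>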

– crh225