I'm considering whether I should put the following robots.txt on my "sorry server" — the server that returns a maintenance message to our customers while our site is down:
User-agent: *
Disallow: /
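For context, here's how I understand crawlers would interpret that file, checked with Python's standard urllib.robotparser (just a local sanity check of the parsing rules, not a guarantee of how Google actually behaves):

```python
import urllib.robotparser

# The robots.txt my sorry server would return
lines = [
    "User-agent: *",
    "Disallow: /",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(lines)

# Every path appears blocked, for every user agent
print(rp.can_fetch("Googlebot", "/"))          # False
print(rp.can_fetch("Googlebot", "/any/page"))  # False
```

So as far as I can tell, while the sorry server is up, this blocks everything.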
So here are my concerns/questions:
Won't it tell crawlers not to index our site forever, even though our server will be back once the maintenance is done?
If I put this robots.txt on my sorry server, should I put another robots.txt on our regular server that tells crawlers "please index our site"?

[EDIT] Taking this to the extreme: won't it get our site deleted from Google entirely?