I've published a website and, due to a misunderstanding that was beyond my control, I had to block all of its pages from being indexed. Some of these pages had already been linked on social networks, so to avoid a bad user experience I decided to add the following to "robots.txt":
User-agent: *
Disallow: *
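(As far as I understand, the conventional way to block every crawler from every page is shown below; I'm assuming Google treats my wildcard version the same way, but I'm not certain of that.)

User-agent: *
Disallow: /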
I've received a "critical problem" alert on webmaster tools and I'm a bit worried about it. In your experience, would it be sufficient (whenever possible) to restore the original "robots.txt"? May the current situation leave consequences (penalizations or similar) on the website if it lasts for long time (and if it does, how can I fix it)?. I'm sorry if the question may sound a bit generic, but I'm not able to find specific answers. Thanks in advance.