
I am a novice user.
I installed an SSL certificate on my WordPress / WooCommerce site about a month ago; the HTTP version of the site had only been in place since late December. I am in the process of setting up a Google Merchant Centre shopping feed and received an alert from Google letting me know that they were having trouble crawling my site images due to an issue with my robots.txt file. I checked in Google WMT and found that my HTTPS property is fine, with no issues, but that my old HTTP property is listed as having "severe health issues" related to the robots.txt file. When I run the robots.txt tester I get the following:
User-agent: *
Disallow: /
Crawl-delay: 10 

Should I delete my old HTTP properties from Google WMT, or is there something else I should do, e.g. set up a 301 redirect from HTTP to HTTPS, fix the HTTP robots.txt file, or something else entirely?

Again, apologies: I'm a bit of a novice, so I'm hoping someone can guide me through the right steps to take.

1 Answer

You should 301 redirect all your HTTP traffic to the HTTPS versions of your pages.
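As a minimal sketch, assuming your site runs on Apache with mod_rewrite enabled (a common setup for WordPress, but an assumption here since you haven't said what server you're on), rules like these near the top of your .htaccess file would send every HTTP request to its HTTPS equivalent:

# Assumes Apache with mod_rewrite; if your host uses nginx or something else, the equivalent goes in the server config instead.
RewriteEngine On
# Only rewrite requests that arrived over plain HTTP.
RewriteCond %{HTTPS} off
# Permanently (301) redirect to the same host and path over HTTPS.
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

Whichever way you set it up, load an old http:// URL in your browser (or check its response headers) and confirm it ends up on the https:// version with a 301 status.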

Google should get the redirect next time it crawls your site.

However, you are currently blocking Google with that robots.txt, so it can't crawl the HTTP site and therefore won't see your redirects. There is no reason to do this - at least not until Google has replaced all your pages with the HTTPS versions (if it ever does!).
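For reference, a robots.txt that lets crawlers fetch everything would look something like the sketch below; exactly what you keep in it (the crawl delay, any paths you genuinely want to exclude) is up to you:

# An empty Disallow means "nothing is disallowed", so the whole site can be crawled.
User-agent: *
Disallow:
# Optional: Googlebot ignores Crawl-delay, though some other crawlers respect it.
Crawl-delay: 10

Once the HTTP pages redirect to HTTPS, the HTTP robots.txt matters much less anyway; the key point is that nothing stops Googlebot from fetching the old URLs and seeing the 301s.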
