
Google announced today (May 28, 2014) that JavaScript content will be rendered by Googlebot. Wow, great news! So there should be no need to serve pre-rendered pages for crawling purposes anymore. Read more about it at http://googlewebmastercentral.blogspot.de/2014/05/understanding-web-pages-better.html

But I rejoiced too early. I turned off my pre-render service and let Google crawl my site via Webmaster Tools. When I looked into the rendered HTML code, I found this:

<div ng-view></div>

So obviously Google does not render ng-view correctly (hopefully only at the moment). So I turned my pre-render service back on and crawled the site again. And here is the second problem: Google no longer automatically translates the hashbang (#!) in the URL, which signals to Google that the website serves AJAX content, into ?_escaped_fragment_=. More information about AngularJS and SEO can be found here: http://www.yearofmoo.com/2012/11/angularjs-and-seo.html
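For context, this is the URL translation the AJAX crawling scheme describes and that seems to no longer happen. A minimal sketch of the mapping (the function name and example URL are my own, not from the spec):

```javascript
// Sketch of the mapping from Google's AJAX crawling scheme: a crawler
// is supposed to turn the #! fragment into an _escaped_fragment_
// query parameter before fetching the page from the server.
function toEscapedFragmentUrl(url) {
  var idx = url.indexOf('#!');
  if (idx === -1) return url; // no hashbang, nothing to translate
  var base = url.slice(0, idx);
  var fragment = url.slice(idx + 2);
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}

// toEscapedFragmentUrl('http://example.com/#!/products/1')
// → 'http://example.com/?_escaped_fragment_=%2Fproducts%2F1'
```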

As far as I know, all pre-render services check for the ?_escaped_fragment_= string in the URL. If the string is present, the pre-render service serves the HTML snapshot of the site. But Google doesn't send it anymore. So in conclusion: at the moment, sites with JS/AJAX content cannot be crawled by Google.
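To illustrate what breaks: the check a pre-render service performs is roughly the following. This is a simplified sketch; the in-memory snapshot store is a hypothetical placeholder (real services generate snapshots with a headless browser), but the _escaped_fragment_ check is the part Google no longer triggers:

```javascript
// Sketch of the decision a pre-render service makes per request.
// The snapshot store below is a hypothetical stand-in for however
// snapshots are actually generated and cached.
var snapshots = {
  '/products/1': '<html><body>Product 1 (pre-rendered)</body></html>'
};

// Returns the pre-rendered snapshot for crawler requests
// (?_escaped_fragment_=...), or null so that normal browser
// requests fall through to the regular JavaScript app.
function handleRequest(url) {
  var match = url.match(/[?&]_escaped_fragment_=([^&]*)/);
  if (!match) return null;            // normal request: serve the JS app
  var fragment = decodeURIComponent(match[1]);
  return snapshots[fragment] || null; // crawler request: serve snapshot
}
```

If Googlebot requests the #! URL directly instead of the translated one, `handleRequest` never sees the parameter and the snapshot is never served.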

Has anyone had a similar experience with this? Is there maybe a solution to this problem?


1 Answer


To second your findings: I'm also not seeing Google pre-render AJAX content correctly, at least not in the Webmaster Tools renderer. Google had been rendering AJAX content correctly before, following their own "Making AJAX Applications Crawlable" guidelines (more at: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started), both in the search index and in the Webmaster Tools service, fetching the content from ?_escaped_fragment_=.

As this seems to be a fault on Google's side, we won't find an answer here; all we can do is inform Google about it.
