Google announced today (May 28, 2014) that JavaScript content will be rendered by the Googlebot. Great news! So there should be no more need to serve pre-rendered pages for crawling purposes. Read more about it here: http://googlewebmastercentral.blogspot.de/2014/05/understanding-web-pages-better.html
But it seems I was pleased too early. I turned off my pre-render service and let Google crawl my site via Webmaster Tools. When I looked into the rendered HTML code, I found this:
<div ng-view></div>
So obviously, Google does not render ng-view correctly (hopefully only for the moment). So I turned my pre-render service back on and crawled the site again. And here is the second problem: Google no longer automatically translates the hashbang (#!) in the URL, which signals to Google that the site contains AJAX content, into ?_escaped_fragment_=.
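For anyone unfamiliar with that translation step: under Google's AJAX crawling scheme, the crawler was supposed to rewrite the hashbang URL before requesting the page. A minimal sketch of that rewrite (the function name is mine, purely illustrative):

```javascript
// Illustrative sketch of the URL translation the crawler performs under
// the AJAX crawling scheme, e.g.:
//   https://example.com/#!/products/42
//   → https://example.com/?_escaped_fragment_=%2Fproducts%2F42
function toEscapedFragment(url) {
  var i = url.indexOf('#!');
  if (i === -1) return url; // no hashbang, nothing to translate
  var base = url.slice(0, i);
  var fragment = url.slice(i + 2);
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}

console.log(toEscapedFragment('https://example.com/#!/products/42'));
// → https://example.com/?_escaped_fragment_=%2Fproducts%2F42
```

This is the request a prerender service expects to see; if Google stops issuing it, the whole scheme falls apart.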
. More information about AngularJS and SEO can be found here: http://www.yearofmoo.com/2012/11/angularjs-and-seo.html
As far as I know, all prerender services check for the ?_escaped_fragment_= string in the URL. If the string is present, the prerender service serves the HTML snapshot of the site. But Google no longer sends it. So in conclusion: at the moment, sites with JS/AJAX content cannot be crawled correctly by Google.
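To make the failure mode concrete, here is a minimal sketch of the server-side check such a service relies on (the function name is mine, not any particular service's API):

```javascript
// Minimal sketch of the check a prerender service keys off.
// shouldServeSnapshot is an illustrative name, not a real API.
function shouldServeSnapshot(requestUrl) {
  // Serve the pre-rendered HTML snapshot only when the crawler has
  // announced itself by rewriting #! into the _escaped_fragment_ parameter.
  return requestUrl.indexOf('_escaped_fragment_=') !== -1;
}

// If Google no longer sends the parameter, this check never fires, and the
// crawler only ever sees the empty <div ng-view></div> shell:
console.log(shouldServeSnapshot('/products?_escaped_fragment_=/42')); // true
console.log(shouldServeSnapshot('/#!/products/42'));                  // false
```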
Has anyone had a similar experience with this? Is there perhaps a solution to this problem?