Exciting news from Google today (May 28, 2014): Googlebot will now render JavaScript content itself! This means there's no longer any need to serve pre-rendered pages just for crawling. You can find more details on the announcement here.
However, my initial excitement was short-lived. I disabled my pre-rendering service and had Google fetch the site via Webmaster Tools. Inspecting the rendered HTML, I found only:
<div ng-view></div>
It seems that Google isn't rendering the ng-view correctly (at least not yet). So I re-enabled my pre-render service and had the site crawled again. But then I hit a second issue: Google was no longer automatically translating the hashbang (#!) in the URL into ?_escaped_fragment_=, the marker that signals AJAX content on a website. More information on AngularJS and SEO can be found here.
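For context, this is roughly the URL translation the old AJAX crawling scheme performed: everything after the #! is moved into an _escaped_fragment_ query parameter, with special characters percent-escaped. A minimal sketch (the function name is mine, and the escaping here is illustrative rather than spec-complete):

```javascript
// Sketch of the hashbang -> _escaped_fragment_ mapping the crawler used to do.
// Example: http://example.com/#!/products?id=3
//       -> http://example.com/?_escaped_fragment_=%2Fproducts%3Fid%3D3
function toEscapedFragmentUrl(url) {
  const i = url.indexOf('#!');
  if (i === -1) return url;           // no hashbang: nothing to translate
  const base = url.slice(0, i);
  const fragment = url.slice(i + 2);  // everything after "#!"
  const sep = base.includes('?') ? '&' : '?';
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}
```

When Google performed this translation, the prerender service saw the _escaped_fragment_ request and could answer with a static snapshot; without the translation, the service never gets triggered.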
From what I've gathered so far, all prerender services look for the ?_escaped_fragment_= string in the URL; when it's present, they serve up an HTML snapshot of the page instead of the JavaScript app. Google, however, seems to have moved away from this approach. In effect, websites with JS/AJAX content may not be reliably crawled by Google at the moment.
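The check these services perform boils down to something like the following (a hypothetical helper of my own, not any particular service's API):

```javascript
// Decide whether an incoming request should receive the pre-rendered HTML
// snapshot: true if the query string carries _escaped_fragment_.
function wantsSnapshot(requestUrl) {
  // The request URL may be relative; resolve against a dummy base for parsing.
  const u = new URL(requestUrl, 'http://dummy.local');
  return u.searchParams.has('_escaped_fragment_');
}
```

A server would consult this on each request and either return the cached snapshot or fall through to the normal app shell. If Googlebot never appends _escaped_fragment_, this branch is simply never taken, which matches what I'm seeing.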
Has anyone else faced a similar situation? Are there any possible solutions to this problem?