I have a website hosted on a 1&1 shared server, and I want to make my AJAX-loaded content crawlable by Googlebot. The site already uses hash-bang (#!) URLs, but now I'm stuck on the _escaped_fragment_ side of the scheme: I'm considering installing HtmlUnit, Node.js, or Zombie.js on the server so it can render my JavaScript events and return HTML snapshots to Google.
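To make it concrete, this is roughly the step I have in mind (a rough Zombie.js sketch; example.com and the #!page=about fragment are just placeholders for my own URLs):

    // Rough sketch of the snapshot step (Zombie.js on Node.js).
    var Browser = require('zombie');
    var browser = new Browser();

    // Google requests /?_escaped_fragment_=page=about for /#!page=about,
    // so the server would map that back to the hash-bang URL and render it.
    browser.visit('http://example.com/#!page=about', function (error) {
      if (error) throw error;

      // browser.html() returns the DOM after the AJAX content has loaded;
      // this is the HTML snapshot that would be returned to the crawler.
      console.log(browser.html());
    });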
However, I'm not sure where (or how) I would actually install this kind of software on a shared server.
From what I've read so far, it seems I would need to switch to a virtual private server (VPS) to get root access and install the necessary libraries... Is that correct?
Significantly increasing my annual hosting costs just so Google can index my content isn't very appealing... Do you know of any way to achieve crawlable AJAX on a shared hosting platform?
If not, is it possible to host the required libraries somewhere else while keeping the current site exactly as it is, roughly along the lines of the sketch below?
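Here is what I imagine "hosting the libraries elsewhere" would look like: a tiny Node.js service on some other box that takes a URL and returns the rendered HTML, which my shared-hosted site would call whenever it receives a ?_escaped_fragment_= request. The port and parameter name are just placeholders:

    // Minimal sketch of an external snapshot service (Node.js + Zombie.js).
    var http = require('http');
    var url = require('url');
    var Browser = require('zombie');

    http.createServer(function (req, res) {
      // e.g. GET /?url=http%3A%2F%2Fexample.com%2F%23!page%3Dabout
      var target = url.parse(req.url, true).query.url;
      if (!target) {
        res.writeHead(400);
        return res.end('missing url parameter');
      }

      var browser = new Browser();
      browser.visit(target, function (error) {
        if (error) {
          res.writeHead(500);
          return res.end(String(error));
        }
        // Return the post-AJAX DOM as the snapshot for the crawler.
        res.writeHead(200, { 'Content-Type': 'text/html' });
        res.end(browser.html());
      });
    }).listen(8080);

Is something like that workable, or does it cause problems I'm not seeing?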
I'm looking for practical advice, since Google doesn't provide much guidance on this and most online resources don't go into the hosting side in any depth. Thank you for your assistance!