If you have an AJAX-based website, then the latest news from Google should put a smile on your face.
At the recent SMX East conference, Google announced a proposal to begin crawling and indexing AJAX content.
This is big news because search engines are currently unable to crawl and index AJAX-based websites, which are essentially a deadbolted door when a spider comes calling.
Barry Schwartz over at Search Engine Roundtable explains the proposal in layman’s terms, which I have included below:
Google’s goals in creating this proposal include:
- Minimal changes are required as the website grows
- Users and search engines see the same content (no cloaking)
- Search engines can send users directly to the AJAX URL (not to a static copy)
- Site owners have a way of verifying that their AJAX website is rendered correctly and thus that the crawler has access to all the content
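For readers curious what this could look like in practice: the heart of the proposal is a convention for exposing AJAX page states as crawlable URLs. Here is a minimal sketch in Python, assuming the "#!" fragment marker and `_escaped_fragment_` query parameter that Google's scheme describes (neither is detailed in the summary above, so treat the exact names as illustrative):

```python
# A sketch of the URL rewriting at the core of the proposal.
# Assumption: an AJAX state marked with "#!" is rewritten by the crawler
# into an "_escaped_fragment_" query parameter that the server can answer
# with a rendered snapshot of that state.
from urllib.parse import quote

def to_crawler_url(ajax_url: str) -> str:
    """Rewrite an AJAX '#!' URL into the escaped form a crawler would fetch."""
    if "#!" not in ajax_url:
        return ajax_url  # no crawlable AJAX state to expose
    base, fragment = ajax_url.split("#!", 1)
    separator = "&" if "?" in base else "?"
    return f"{base}{separator}_escaped_fragment_={quote(fragment, safe='')}"

print(to_crawler_url("http://example.com/page#!key=value"))
# http://example.com/page?_escaped_fragment_=key%3Dvalue
```

Because the rewritten URL maps one-to-one back to the original "#!" URL, the search engine can send users directly to the AJAX page state rather than to a static copy, which is exactly goal three above.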
Although this announcement was only made a couple of days ago, there has been a lot of chatter already on forums across the internet. Here is what some people have to say about the proposal:
I can easily see this would benefit ALL search engines and many end users. Taking the user directly to content that’s only available in a modified page state would be a good thing. Many/most webmasters will never have need for AJAX. But for those who use it, making that content more crawlable is a very sane goal.
This proposal is as good as any I’ve heard so far. I’d also like to hear from Bing, Yahoo, Ask, Cuil, and some other players before this gets widely adopted.
It seems like Google are going down the Microsoft route of wanting the rest of the world to fit in with their standards rather than the proper accepted standards.
It’s tough at the top!
The official post from Google is here if you would like to read more about the proposal. We will continue to update you on any new developments.