Google SEO Advice On Ajax Coding

The Google Webmaster Central Blog has published a post on how to make your site crawlable if you use common Web 2.0 design features such as AJAX. The advice includes having HTML links for your navigation, ensuring that your site works in browsers with JavaScript turned off, and using real links in your JavaScript or AJAX.

Here's the post:

Many webmasters have discovered the advantages of using Ajax to improve the user experience on their sites, creating dynamic pages that act as powerful web applications. But, like Flash, Ajax can make a site difficult for search engines to index if the technology is not implemented carefully. As promised in our post answering questions about server location, cross-linking, and Web 2.0 technology, we've compiled some tips for creating Ajax-enhanced websites that are also understood by search engines.

How will Google see my site? One of the main issues with Ajax sites is that while Googlebot is great at following and understanding the structure of HTML links, it can have a difficult time finding its way around sites which use JavaScript for navigation. While we are working to better understand JavaScript, your best bet for creating a site that's crawlable by Google and other search engines is to provide HTML links to your content.
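To illustrate the difference (this example is not from Google's post), a JavaScript-only link gives the crawler nothing to follow, while a plain HTML link does; showPage() here is just a hypothetical Ajax handler:

<!-- Hard for a crawler: the markup contains no real destination, only a script call -->
<a href="#" onclick="showPage('products'); return false">Products</a>

<!-- Easy for a crawler: a plain HTML link with a real URL -->
<a href="products.html">Products</a>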

Design for accessibility: We encourage webmasters to create pages for users, not just search engines. When you're designing your Ajax site, think about the needs of your users, including those who may not be using a JavaScript-capable browser. There are plenty of such users on the web, including those using screen readers or mobile devices.

One of the easiest ways to test your site's accessibility to this type of user is to explore the site in your browser with JavaScript turned off, or by viewing it in a text-only browser such as Lynx. Viewing a site as text-only can also help you identify other content which may be hard for Googlebot to see, including images and Flash.

Develop with progressive enhancement: If you're starting from scratch, one good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with Ajax. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your Ajax bonuses.
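As a rough sketch of that approach (not taken from the post), the navigation below is plain HTML that works everywhere, and a small script layered on top intercepts clicks to load content via Ajax; the #content container and the use of fetch() are assumptions for the example:

<!-- Plain HTML navigation and content area: works without JavaScript and is easy to crawl -->
<ul id="nav">
  <li><a href="products.html">Products</a></li>
  <li><a href="about.html">About</a></li>
</ul>
<div id="content"></div>

<script>
// Progressive enhancement: only browsers running JavaScript take this path;
// everyone else (including Googlebot) simply follows the plain links above.
document.getElementById('nav').addEventListener('click', function (event) {
  var link = event.target.closest('a');
  if (!link) return;
  event.preventDefault();                    // keep Ajax users on the page
  fetch(link.getAttribute('href'))           // request the same URL the link points to
    .then(function (response) { return response.text(); })
    .then(function (html) {
      document.getElementById('content').innerHTML = html;
    });
});
</script>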

Of course you will likely have links requiring JavaScript for Ajax functionality, so here's a way to help Ajax and static links coexist:

When creating your links, format them so they'll offer a static link as well as calling a JavaScript function. That way you'll have the Ajax functionality for JavaScript users, while non-JavaScript users can ignore the script and follow the link. For example:

<a href="ajax.html?foo=32" onClick="navigate('ajax.html#foo=32'); return false">foo 32</a>

Note that the static link's URL has a parameter (?foo=32) instead of a fragment (#foo=32), which is used by the Ajax code. This is important, as search engines understand URL parameters but often ignore fragments. Web developer Jeremy Keith labeled this technique Hijax. Since you now offer static links, users and search engines can link to the exact content they want to share or reference.
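The post doesn't show what navigate() does, but one possible sketch (an illustration, not Google's code, with a hypothetical #content container and "foo=32" simply carried over from the example above) would read the fragment, keep it bookmarkable, and fetch the parameter version of the same URL:

<script>
// A possible navigate() implementation for the Hijax pattern sketched above.
function navigate(urlWithFragment) {
  var fragment = urlWithFragment.split('#')[1];   // e.g. "foo=32"
  window.location.hash = fragment;                // keep the state bookmarkable
  fetch('ajax.html?' + fragment)                  // request the crawlable, parameter-based URL
    .then(function (response) { return response.text(); })
    .then(function (html) {
      document.getElementById('content').innerHTML = html;  // hypothetical container
    });
}
</script>

The server would then return the same content for ajax.html?foo=32 whether it is requested by this script or fetched directly by a crawler following the static link.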

While we're constantly improving our crawling capability, using HTML links remains a strong way to help us (as well as other search engines, mobile devices and users) better understand your site's structure.

Full Article

Google, GWC, Ajax, Web 2.0, Search Engine, Bot, Spider, SEO, Crawling, Indexing, Tips and Tricks